OSINT Tool Tuesday - Instagram, Dark Web, Google Search

OSINT Tool Tuesday is a weekly series discussing valuable tools for OSINT investigations. This week discusses tools for Instagram, the dark web, and Google Search.

Another week, another set of tools. This week we'll be looking at tools for investigations using Instagram, a CLI tool for navigating the dark web, and a browser extension to enhance your Google searches. The goal is to expand your mindset on what's possible using tools, increase your efficiency, and amplify your outcomes.

Make sure to subscribe to get the latest and greatest tools every Tuesday!

Instagram Scraper

Using a Python script can be incredibly helpful when it comes to social media data collection. A web app or browser extension takes up valuable space in your workflow, and both are often slower and less reliable than a Python script.

Recently I've been using Instagram Scraper by arc298 for social media investigations that require the full extraction of an Instagram profile. I like it because it's very easy to get up and running, especially if you find yourself frequently on different machines. It's also very user friendly: once you install the package, you can begin scraping with a single, easy-to-understand command. Finally, it's fast and can operate in the background without supervision. I stress tested it against Instagram's own profile, and it downloaded 2,450 pieces of content in only a couple of minutes.

Instagram Scraper also lets you process usernames in bulk. You can either have it read a TXT file and iterate through the list, or pass a few usernames at a time, separated by commas, on the command line. That puts it in the set-it-and-forget-it category of tools that can greatly increase your efficiency with OSINT.
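As a quick sketch of what that looks like in practice — assuming the package installs from PyPI under the name instagram-scraper (the usernames and file name below are placeholders):

```shell
# Install the package (assumes the PyPI name; you can also install
# straight from the arc298/instagram-scraper repo)
pip install instagram-scraper

# Scrape a single profile
instagram-scraper someuser

# A few usernames at once, comma separated
instagram-scraper someuser,otheruser,thirduser

# Bulk mode: read usernames from a text file, one per line
instagram-scraper -f users.txt
```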

You can also use Instagram Scraper to download images by hashtag. This can be extremely helpful for documenting images under trending hashtags of interest. It can also be useful from a research standpoint, for example to gather topic-labeled images for computer vision applications.

There's still more value here, believe it or not. Instagram Scraper also lets you scrape content based on location IDs. So if you're doing geolocation-specific work, you could begin collecting data tied to a certain location very quickly. Whether it's a short-term investigation or a long-term study, this tool is an absolute must in your toolkit.
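The hashtag and location modes follow the same pattern. Per the project's README, --tag treats the argument as a hashtag and --location as a numeric Instagram location ID (the values below are placeholders):

```shell
# Scrape media for a hashtag (no leading #)
instagram-scraper osint --tag

# Scrape media tagged with a specific Instagram location ID
instagram-scraper 123456789 --location
```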

Instagram Scraper by arc298 (arc298/instagram-scraper on GitHub)

OSINT_Tools - Shandymen

Let's talk about OSINT for the dark web, plus a bit about forensics like metadata. Use of the dark web seems to grow every day. This means that, much like on the surface web, the amount of available data is growing exponentially while the time allotted for collection keeps shrinking. Automation is the answer here: Shandymen created a tool called TorGrab that takes a query, scans the dark web for related .onion sites, and gives you an easy-to-manage CSV at the end.

I was really blown away by TorGrab and how easy it was to use. You simply clone the repo, make sure you have all of the dependencies installed, and run your first query. The README doesn't give much instruction, so use this as a quick installation and startup guide:

git clone https://github.com/Shandymen/OSINT_Tools.git

pip install bs4 requests

python TorGrab.py <keyword>

I searched for the keyword "drugs" and quickly got back ten .onion URLs, along with when each was last active.

Screenshot from terminal using TorGrab.py

I kept seeing "Scanning Ahmia for: <keyword>" in my terminal, so I decided to look into what that was. It led me to ahmia.fi, a search engine for hidden services on the dark web. It seems TorGrab.py is a wrapper for that website. That's not necessarily a bad thing. Once again, it's a great way to save browser space.
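Since TorGrab is essentially a wrapper around Ahmia, you can sketch the same idea in a few lines of Python. This is a hypothetical illustration, not TorGrab's actual code: it fetches Ahmia's clearnet search page for a keyword (the URL format is an assumption, and the helper names are mine) and pulls the .onion hostnames out with a regular expression.

```python
import re
import urllib.parse
import urllib.request

# .onion hostnames are base32: 16 chars for v2 addresses, 56 for v3
ONION_RE = re.compile(r"[a-z2-7]{16,56}\.onion")

def extract_onion_hosts(html: str) -> list[str]:
    """Return unique .onion hostnames found in a blob of HTML, in order."""
    seen: list[str] = []
    for host in ONION_RE.findall(html):
        if host not in seen:
            seen.append(host)
    return seen

def search_ahmia(keyword: str) -> list[str]:
    """Fetch Ahmia's clearnet search page and scrape the .onion hosts on it."""
    url = "https://ahmia.fi/search/?q=" + urllib.parse.quote(keyword)
    with urllib.request.urlopen(url, timeout=30) as resp:
        return extract_onion_hosts(resp.read().decode("utf-8", "replace"))
```

From there, writing each host to a CSV row is one `csv.writer` call away, which is roughly the workflow TorGrab automates for you.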

Dark Web and Hash OSINT Tools by Shandymen (Shandymen/OSINT_Tools on GitHub)

Google Search by Time/Date

The final tool is a browser extension that speeds up time-based pivoting when doing advanced Google searches for OSINT. Typically, if you want to find content within a certain date range, you have to complete your query and then apply a date filter. Google Search by Time/Date puts that filter at the top of your search results, saving you a few seconds on each query. That doesn't sound like much, but over the course of a year it could add up to hours. Furthermore, it's a passive extension, which means you don't have to pin it to your extension bar in Chrome, and that's valuable real estate these days.
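Under the hood, that filter is just Google's own tbs query parameter on the results URL. As a rough sketch (the tbs=cdr format is an observed convention, not documented by Google, and the helper name is mine), you can build a date-restricted search URL directly:

```python
from urllib.parse import urlencode

def google_date_search_url(query: str, cd_min: str, cd_max: str) -> str:
    """Build a Google search URL restricted to a custom date range.

    cd_min and cd_max use Google's M/D/YYYY format, e.g. "1/1/2023".
    The tbs=cdr:1,... value mirrors what Google's own date filter sets.
    """
    params = {
        "q": query,
        "tbs": f"cdr:1,cd_min:{cd_min},cd_max:{cd_max}",
    }
    return "https://www.google.com/search?" + urlencode(params)
```

For example, google_date_search_url("osint tools", "1/1/2023", "6/30/2023") should yield a results page already limited to the first half of 2023 — handy for scripting the same pivot the extension makes one click.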

Google Search by Time/Date in the Chrome Web Store

Google Search by Time/Date also forces your brain to consider date and time while searching. I'm often susceptible to trusting the first and second pages of Google results. When time is persistently the first thing you read on a results page, you're more likely to shift your mindset and adopt a best practice.

Google Search by Time/Date Activated

I went ahead and confirmed that this browser extension has a GitHub repo, to make sure the developer is transparent about their tooling. I also took a look at the source code to check that the manifest doesn't request unnecessary permissions and that no odd scripts run while it's installed. I'm not an application security expert, but I did some due diligence before using it, and I suggest you adopt that practice with any new tool.

GitHub repo for Google Search by Time/Date (cgerikj/Search-Date-Bar)

Remember, OSINT != tools. OSINT tools help you plan and collect data, but the end result of that tool is not OSINT. You have to analyze, verify, receive feedback, refine, and produce a final, actionable product of value before it can be intelligence.

Thanks for reading. If you enjoyed this post, make sure to subscribe. A new one just like this will be posted every Tuesday at 6:00 PM UTC-5:00.
