ExtensionCrawler

A collection of utilities for downloading and analyzing browser extensions from the Chrome Web Store.

  • crawler: A crawler for extensions from the Chrome Web Store.
  • crx-tool: A tool for analyzing and extracting *.crx files (i.e., Chrome extensions). Calling crx-tool.py <extension>.crx will check the integrity of the extension (a sketch of the underlying file format follows this list).
  • extract-crx: A simple tool for extracting *.crx files from the tar-based archive hierarchy.
  • create-db: A tool for creating/initializing the database files from already existing extension archives.
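
A *.crx (version 2) file is a short binary header, followed by the signing key and signature, followed by the extension's ZIP archive. The following is a minimal sketch in Python that parses and sanity-checks this header; it is illustrative only and is not the implementation used by crx-tool:

    import struct
    import sys

    def check_crx_header(path):
        """Sanity-check the CRX2 header of the file at path."""
        with open(path, "rb") as f:
            if f.read(4) != b"Cr24":
                raise ValueError("not a CRX file (bad magic)")
            version, = struct.unpack("<I", f.read(4))
            if version != 2:
                raise ValueError("unsupported CRX version %d" % version)
            key_len, sig_len = struct.unpack("<II", f.read(8))
            f.read(key_len)  # DER-encoded RSA public key
            f.read(sig_len)  # RSA signature over the ZIP payload
            if f.read(4) != b"PK\x03\x04":
                raise ValueError("embedded ZIP archive not found")
        print("%s: CRX2 header OK (key: %d bytes, signature: %d bytes)"
              % (path, key_len, sig_len))

    if __name__ == "__main__":
        check_crx_header(sys.argv[1])

A full integrity check would additionally verify the RSA signature over the ZIP payload, which this sketch omits.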

The utilities store the extensions in the following directory hierarchy:

   archive
   ├── conf
   │   └── forums.conf
   ├── data
   │   └── ...
   └── log
       └── ...

The crawler downloads the most recent version of each extension (i.e., the *.crx file) as well as its overview page. In addition, the conf directory may contain a single file, called forums.conf, that lists the ids of the extensions for which the forum and support pages should be downloaded as well. The data directory will contain the downloaded extensions as well as sqlite files containing the extracted metadata. The sqlite files can easily be re-generated using the create-db tool.
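
The forums.conf file simply lists one extension id per line; the ids shown here are made-up examples:

    aaaabbbbccccddddeeeeffffgggghhhh
    bbbbccccddddeeeeffffgggghhhhiiii

Since the metadata schema is produced by the tools themselves, a schema-agnostic way to inspect one of the generated sqlite files is to list its tables (the file name below is hypothetical):

    import sqlite3

    # File name is hypothetical; pick an actual sqlite file under archive/data.
    conn = sqlite3.connect("archive/data/example.sqlite")
    for (name,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"):
        print(name)
    conn.close()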

All utilities are written in Python 3.x. The required modules are listed in the file requirements.txt.
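
If you only need the dependencies (e.g., for running the scripts directly from a clone), they can be installed straight from that file:

    pip install -r requirements.txt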

Installation

Clone the repository and use pip to install it as a package:

git clone git@logicalhacking.com:BrowserSecurity/ExtensionCrawler.git
pip install -e ExtensionCrawler
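
Afterwards, a quick way to check that the package was picked up is to import it (the module name ExtensionCrawler is assumed from the repository name):

    python3 -c "import ExtensionCrawler"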

Team

License

This project is licensed under the GPL 3.0 (or any later version).