ExtensionCrawler

A collection of utilities for downloading and analyzing browser extensions from the Chrome Web Store.

  • crawler: A crawler for extensions from the Chrome Web Store.
  • crx-tool: A tool for analyzing and extracting *.crx files (i.e., Chrome extensions). Calling crx-tool.py <extension>.crx checks the integrity of the extension (a sketch of such a check is shown after this list).
  • extract-crx: A simple tool for extracting *.crx files from the tar-based archive hierarchy.
  • create-db: A tool for creating/initializing the database files from already existing extension archives.

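For illustration, the following is a minimal sketch of the kind of integrity check crx-tool performs; it is not the actual implementation. It assumes the CRX2 container format used by Chrome extensions at the time: a "Cr24" magic number, a version field, the public key and signature lengths, followed by the ZIP payload.

    # Minimal sketch (not the actual crx-tool code), assuming the CRX2 format:
    # "Cr24" magic, uint32 version, public key and signature lengths, ZIP payload.
    import io
    import struct
    import sys
    import zipfile

    def check_crx(path):
        with open(path, "rb") as f:
            data = f.read()
        magic, version = struct.unpack("<4sI", data[:8])
        if magic != b"Cr24":
            return False, "not a CRX file (bad magic number)"
        if version != 2:
            return False, "unsupported CRX version %d" % version
        # CRX2 header: magic, version, public key length, signature length
        pk_len, sig_len = struct.unpack("<II", data[8:16])
        payload = data[16 + pk_len + sig_len:]
        try:
            with zipfile.ZipFile(io.BytesIO(payload)) as zf:
                bad = zf.testzip()  # name of the first corrupt member, or None
        except zipfile.BadZipFile:
            return False, "embedded ZIP archive is corrupt"
        return (bad is None), bad

    if __name__ == "__main__":
        ok, detail = check_crx(sys.argv[1])
        print("OK" if ok else "FAILED: %s" % detail)

A full integrity check would also verify the RSA signature against the embedded public key, which this sketch omits.
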
The utilities store the extensions in the following directory hierarchy:

   archive
   ├── conf
   │   └── forums.conf
   ├── data
   │   └── ...
   └── log
       └── ...

The crawler downloads the most recent version of each extension (i.e., the *.crx file as well as the overview page). In addition, the conf directory may contain one file, called forums.conf, that lists the ids of extensions for which the forum and support pages should be downloaded as well. The data directory will contain the downloaded extensions as well as SQLite files containing the extracted metadata. The SQLite files can easily be re-generated using the create-db tool.
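
As an illustration of how the forums.conf whitelist might be consumed, the following sketch reads it and answers whether forum and support pages should be fetched for a given extension id. The one-id-per-line format (with '#' comments) and the archive location are assumptions for this example, not a documented interface.

    # Hypothetical sketch: read the forums.conf whitelist. The one-id-per-line
    # format with '#' comments is an assumption, not a documented interface.
    from pathlib import Path

    def read_forum_ids(archive_dir="archive"):
        conf = Path(archive_dir) / "conf" / "forums.conf"
        if not conf.is_file():
            return set()
        ids = set()
        for line in conf.read_text().splitlines():
            line = line.split("#", 1)[0].strip()  # strip comments and whitespace
            if line:
                ids.add(line)
        return ids

    forum_ids = read_forum_ids()

    def wants_forums(extension_id):
        """Return True if forum/support pages should be downloaded as well."""
        return extension_id in forum_ids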

All utilities are written in Python 3.x. The required modules are listed in the file requirements.txt.

Team

License

This project is licensed under the GPL 3.0 (or any later version).