Crawler PAM, Asset Management and Order
Our software experts developed the OFFICE ASSET Crawler to meet the requirements for secure, regular data transmission. The crawler collects data in the customer network via SCCM or the MS Graph API and transmits it to OFFICE ASSET without requiring a permanent interface into the company's network.
The crawler can also be used for other modules:
- PAM – It automatically searches your network for existing printers and adds them to OFFICE ASSET. In addition, it regularly reads all relevant data from the printers (e.g. printed pages, supply levels, error messages and alert conditions). Based on this data, further processes can be initiated in OFFICE ASSET, e.g. fully automated ordering of required supplies.
- Asset Management – Using SCCM or the MS Graph API, the company's existing assets are detected and their inventory data is imported into OFFICE ASSET.
- Order – To use the Order module, the crawler can trigger PowerShell scripts in the customer network. With this function, an employee can order permissions or software installations via an individualized eShop, and the requests are then executed automatically.
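The PAM workflow above (reading supply levels and triggering reorders) can be sketched roughly as follows. The threshold value, function name, and status data shape are illustrative assumptions, not part of the actual crawler:

```python
# Hypothetical sketch of the PAM reorder decision: when a supply level
# read from a printer drops below a threshold, a reorder is suggested.
# REORDER_THRESHOLD and the status dict shape are assumed for illustration.

REORDER_THRESHOLD = 20  # percent remaining; assumed value

def supplies_to_reorder(printer_status: dict) -> list[str]:
    """Return the supplies whose fill level is below the threshold."""
    return [
        name
        for name, level in printer_status.get("supplies", {}).items()
        if level < REORDER_THRESHOLD
    ]

# Example status in the shape the crawler might report for one printer:
status = {
    "hostname": "printer-01.example.local",
    "pages_printed": 18432,
    "supplies": {"toner_black": 12, "toner_cyan": 55, "drum": 80},
}

print(supplies_to_reorder(status))  # only toner_black is below the threshold
```

In OFFICE ASSET, a result like this could then feed the automated supply-ordering process mentioned above.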
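For the asset import, a minimal sketch of parsing an MS Graph "list devices" response (the real `GET https://graph.microsoft.com/v1.0/devices` endpoint) into inventory records might look like this; the record shape and function name are assumptions for illustration:

```python
import json

# Hypothetical sketch: extract minimal inventory records from an
# MS Graph devices response. displayName, operatingSystem and
# operatingSystemVersion are real Graph device properties; the
# output record shape is an illustrative assumption.

def parse_graph_devices(response_body: str) -> list[dict]:
    """Map a Graph devices response to simple inventory records."""
    payload = json.loads(response_body)
    return [
        {
            "name": device.get("displayName"),
            "os": device.get("operatingSystem"),
            "os_version": device.get("operatingSystemVersion"),
        }
        for device in payload.get("value", [])
    ]

# A trimmed-down example response in the shape Graph returns:
sample = json.dumps({
    "value": [
        {"displayName": "LAPTOP-042", "operatingSystem": "Windows",
         "operatingSystemVersion": "10.0.19045"},
    ]
})
print(parse_graph_devices(sample))
```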
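The Order module's PowerShell triggering could be sketched as a simple dispatch from an eShop order to a script invocation. The script names, order shape, and dispatch table below are purely illustrative assumptions:

```python
# Hypothetical sketch of dispatching an eShop order to a PowerShell
# script in the customer network. Script names, the order dict shape,
# and the parameter names are illustrative assumptions.

SCRIPTS = {
    "grant_permission": "Grant-Permission.ps1",
    "install_software": "Install-Software.ps1",
}

def build_command(order: dict) -> list[str]:
    """Map an order to the PowerShell command line the crawler would run."""
    script = SCRIPTS[order["type"]]
    return ["powershell.exe", "-File", script, "-Target", order["target"]]

command = build_command({"type": "install_software", "target": "alice"})
print(command)
# A real crawler would then execute the command, e.g. via
# subprocess.run(command, check=True) on a Windows host.
```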
Crawler-Downloads
Crawl-E
The crawler for the home office: The Crawl-E was developed specifically for use in home offices, to read print data from company printers used off-site. It offers the following advantages:
- Readout of individual printers by IP or hostname
- Fully preconfigured
- Quickly and easily ready for use
Information for system administrators: Before users can download the preconfigured version of the Crawl-E from OFFICE ASSET, it must first have been installed by a system administrator.
Discontinued versions
| Crawler version | Note | End of support |
| --- | --- | --- |
| Up to version 7 | Update required | Support has ended |
| Versions 7 – 9 | Update recommended | Support will end in 2022 |
Versions: summary of the most important changes
Crawler Version 20.1/.2
- The maximum working memory used by the crawler can now be set via the configuration interface.
- The crawler now indicates when a new version is available.
- You can now set up your own certificate for communication with the configuration page.
- After executing the setup, the page for configuring the crawler can be opened directly.
- Underlying libraries updated to improve stability and security.
Crawler Version 20
- The crawler can now act as a proxy server for the printers' web pages, so that they can also be accessed via a cloud server. (Beta function!)
- Java Major version updated to 17.
- Crawler configuration interface updated and display improved.
Crawler Version 19
- The MS-Graph API can now also read user groups.
- Fixed an issue where the resulting URL could be line-wrapped.
- The Linux version can now be executed with less privileged rights.
- Fixed a problem where an update may not have worked.
Crawler Version 18
- The GUI now displays the URL under which the crawler can be reached.
- Fixed an issue where, in a Docker container, it was not possible to connect to the crawler service via HTTPS.