Crawler PAM, Asset Management and Order

Our software experts developed the OFFICE ASSET Crawler to meet the requirements of secure and regular data transmission. The crawler collects data in the customer network via SCCM or MS Graph and transmits it to OFFICE ASSET without using a permanent interface into the company's network.

The crawler can also be used for other modules:

  • PAM – It searches your network fully automatically for existing printers and adds them to OFFICE ASSET. In addition, it regularly reads out all relevant data from the printers (e.g. printed pages, supply levels, error messages and alert conditions). Based on this data, further processes, such as the fully automated ordering of required supplies, can be initiated in OFFICE ASSET.
  • Asset Management – Using SCCM or the MS Graph API, the company's existing assets are detected and their inventory data is imported into OFFICE ASSET.
  • Order – For the Order module, the crawler can trigger PowerShell scripts in the customer network. This allows an employee to order permissions or software installations via an individualized eShop, and these orders are then executed automatically.
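The product page does not show how the crawler invokes these scripts, so the following Python sketch only illustrates the general pattern: assembling and launching a PowerShell call whose script path and parameters would come from an eShop order. The helper names and the example script `grant_access.ps1` are hypothetical; the `powershell.exe` flags themselves are standard.

```python
import subprocess

def build_powershell_command(script_path, parameters):
    """Assemble the argument list for a PowerShell call.

    -NoProfile, -ExecutionPolicy and -File are standard powershell.exe
    options; the script path and its named parameters would come from
    the eShop order in a real deployment (hypothetical example here).
    """
    cmd = ["powershell.exe", "-NoProfile", "-ExecutionPolicy", "Bypass",
           "-File", script_path]
    for name, value in parameters.items():
        cmd += [f"-{name}", str(value)]
    return cmd

def run_order_script(script_path, parameters):
    """Launch the script and report success via the exit code (0 = OK)."""
    result = subprocess.run(build_powershell_command(script_path, parameters))
    return result.returncode == 0
```

Separating command construction from execution keeps the order-to-script mapping testable without a Windows host.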

Crawler downloads

Crawler version: 12.0
OFFICE ASSET version: 8.0

Downloads for older versions

Crawl-E

The crawler for the home office: Crawl-E was developed specifically for use in home offices, to read the print data of company printers used externally.
It offers the following advantages:

  • Readout of individual printers by IP or hostname
  • Fully preconfigured
  • Quickly and easily ready for use
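How Crawl-E reads the printers is not specified here beyond "by IP or hostname"; network printers typically expose such counters via SNMP (Printer MIB, RFC 3805). As a minimal sketch, assuming the raw values have already been fetched by any SNMP client, the step of mapping standard OIDs to readable field names could look like this (the function name is illustrative; the OIDs are the standard MIB-II/Printer-MIB ones):

```python
# Standard OIDs: sysName from MIB-II (RFC 1213) and prtMarkerLifeCount
# (lifetime page counter) from the Printer MIB (RFC 3805).
PRINTER_OIDS = {
    "1.3.6.1.2.1.1.5.0": "hostname",             # sysName
    "1.3.6.1.2.1.43.10.2.1.4.1.1": "page_count", # prtMarkerLifeCount
}

def label_snmp_values(raw):
    """Translate an {oid: value} dict from an SNMP query into named
    fields, dropping OIDs this sketch does not know about."""
    return {PRINTER_OIDS[oid]: value
            for oid, value in raw.items() if oid in PRINTER_OIDS}
```

The actual network fetch (and any vendor-specific OIDs for supply levels) is omitted, since the product's real mechanism is not documented on this page.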

Information for system administrators:
In order for a user to download the preconfigured version of Crawl-E in OFFICE ASSET, it must first have been installed by a system administrator.
The current Crawl-E version can be downloaded here.

Discontinued versions

Crawler version      Note                  End of support
up to version 7      Update required       Support has ended
versions 7 – 9       Update recommended    Support will end in 2022

Versions: summary of the most important changes

Version 12

12.0

  • Contact now leads directly to the SiBit homepage.

Version 11

11.3

  • Java changes for Linux crawlers.

11.2

  • The language of the GUI is now also specified via the setup.

11.1

  • Minor bug fixes to the GUI. 

11.0

  • The crawler now has a web interface. This enables the service to be operated from other computers.
  • The old GUI has been removed.

Version 10

10.1

  • The MS Graph interface now supplies the system owner's e-mail address with the assets, so that the associated person can (alternatively) be identified.

10.0

  • First version of the Crawl-E.
  • No changes to the crawler.

Version 9

9.3

  • Now includes and uses Java 11.0.8+10 (Zulu 11.41.23) from Azul Systems.
  • The crawler can now execute PowerShell scripts, which are set via the OA interface.

9.2

  • The crawler now sends the local IP and the host name of the computer on which it is installed.

9.1.1

  • Bugfix: If the zone is changed in the support tab, all zone-specific information is removed from the crawler's H2 database (crawl configurations, scan configurations, etc.).

9.1

  • The manual start of the crawl timer has been removed.
  • The crawl timer is now restarted automatically after it has ended.
  • Various minor improvements to the Java daemons.

9.0

  • The crawler can now communicate with the Microsoft Graph API.

Archive

Version 8

8.0

  • The crawler can now perform SCCM queries.
  • Known issue: The translations of the crawler are not working properly. Fixed in version 9.0.

Version 7

7.6

  • Minor bug fixes.

7.5.1

  • The password query for the support tab of the crawler now also works on macOS.

7.5

  • Bugfix: A scan could not be saved in the H2 database if the result value had more than 255 characters. The maximum was increased to 2,500 characters, and a log entry is written if this occurs again.

7.4

  • The REST server now listens on localhost by default, but this can be changed using the "networkInterface" setting in the configuration file.
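The configuration file format is not shown in this changelog; assuming a simple Java-properties-style key=value file, the localhost fallback described for 7.4 could be resolved like this sketch (`parse_properties` and `bind_address` are illustrative names, only the "networkInterface" key comes from the changelog):

```python
def parse_properties(text):
    """Parse simple key=value lines, ignoring blanks and # comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

def bind_address(config_text):
    """Return the interface the REST server should bind to,
    defaulting to localhost when the setting is absent."""
    return parse_properties(config_text).get("networkInterface", "localhost")
```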
7.3

  • A crawl is now also carried out if there is no crawl configuration. This is required to transfer results from local printers if necessary.

7.2

  • Messages are now output on operating systems without a GUI if root rights are missing for the .sh scripts.
  • Various code adjustments to conventions.

7.1

  • Support and setup for Linux operating systems.
  • Bugfix: Fixed some spelling errors in the translations.
  • Bugfix: The user interface tries to load the password for REST communication with the service for 10 seconds; after that, it is assumed that the service has not started. (Only relevant for the first start of the crawler.)
  • Bugfix: The terminal user interface (for example on Linux) now outputs unknown errors.

7.0

  • RMI exchanged for REST (Jersey v2.25.1) (communication GUI <-> service).
  • REST uses basic authentication with a password (see Client.config) and an optional user name.
  • The crawler can now accept results from a local USBPrintReader (also via REST) and send them to OA in the crawl interval.
  • GUI and service dependency management switched to Maven.
  • Hibernate updated from 4.1.9 to 4.3.11 Final (no major update).
  • H2 driver updated from 1.3.176 to 1.4.196 (1.4.197 contains critical bugs).
  • The crawler now empties its crawl configurations when the OA no longer provides any.
  • Whether the PamService is running is now also checked via REST and no longer via a VB script.
  • The distinction between whether the service is unavailable or simply not running has been removed.
  • Mainly serves as preparation for the Unix crawler.
  • Bugfix: Deleting and recreating the result tables was never triggered.
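The basic authentication introduced in 7.0 is the standard HTTP scheme (RFC 7617): the credentials "user:password" are Base64-encoded into an Authorization header. A minimal sketch, with the user name optional as described above (the function name is illustrative, not part of the crawler's API):

```python
import base64

def basic_auth_header(password, username=""):
    """Build an HTTP Basic Authorization header value per RFC 7617.

    An omitted user name simply yields ":password" before encoding,
    mirroring the optional user name mentioned in the changelog.
    """
    token = base64.b64encode(f"{username}:{password}".encode("utf-8"))
    return "Basic " + token.decode("ascii")
```

Note that Base64 is an encoding, not encryption, which is why such credentials should only travel over a protected channel.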

3.x – 6.x: These versions have been discontinued and are no longer supported!

We strongly advise against using these versions. Their security mechanisms are not sufficient by today's standards.

     

     
