kopal Library of Retrieval and Ingest — issues
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues

Issue #1: Unmarshalling Error: java.lang.IllegalArgumentException: Illegal character in opaque part
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/1 | Stefan E. Funk <funk@sub.uni-goettingen.de> | 2022-10-11

See https://gitlab.gwdg.de/jose.calvotello/eltec_import/-/issues/19:
Error at de.langzeitarchivierung.kolibri.actionmodule.textgrid.SubmitFiles:
`Exception in thread "pool-1-thread-2" javax.xml.ws.soap.SOAPFaultException: Unmarshalling Error: java.lang.IllegalArgumentException: Illegal character in opaque part at index 12: viaf:1657716`

Issue #2: Workflow ends before all queues are finished
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/2 | Stefan E. Funk | 2021-11-19

Workflow says:
`[#1] de.langzeitarchivierung.kolibri.WorkflowTool.process() [2021-11-19T15:02:30.129+01:00]
[INFO] The process queue has been processed. Total time elapsed: 33 minutes 532 milliseconds
[INFO] Everything has been done! This process was logged to file: /Users/fugu/work/kolibri-distant-reading_fu/folders/log/kolibri.log
[INFO] *removing log handlers*
[INFO] *stop*`

...and processes are still running.

Issue #3: Add import.zip as GitLab flow artifact?
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/3 | Stefan E. Funk | 2021-11-24

Normally we commit a new import.zip, built via mvn, every time the version changes. Could we do this using GitLab artifacts instead?

Issue #5: Reduce logs while building project
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/5 | Stefan E. Funk | 2021-11-24

To avoid:
`Job's log exceeded limit of 4194304 bytes.
Job execution will continue but no more output will be collected.`

...maybe we do not need all the test logs during GitLab builds?

Issue #7: Reduce JUnit tests to fewer objects for TG-import
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/7 | Stefan E. Funk | 2021-11-24 (Assignee: Stefan E. Funk)

To gain a bit of build duration time :-) We do not need to test the BIIIIG projects every time, I guess!

Issue #11: Check paths to work files in edition metadata files [was #33057]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/11 | Ghost User | 2021-11-25

<isEditionOf>BLUBB/ENG18400.work</isEditionOf> must be tested for existence; there are errors: java.io.IOException: info.textgrid.namespaces.middleware.tgcrud.services.tgcrudservice.IoFault: info.textgrid.namespaces.middleware.tgcrud.services.tgcrudservice.IoFault: Failure storing to the Sesame database

Issue #15: koLibRI CLI: DeleteFiles configuration needs an existing hotfolder and it needs to be filled with some file(s) [was #30147]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/15 | Ghost User | 2021-11-25

You should not need a (filled) hotfolder for deleting files! A hotfolder must only exist if ID mapping files are used for deletion. If the configuration is set to "uri:" or "tgpr:", the hotfolder should not be needed or checked at all.
Issue #17: kolibri-addon-textgrid-import with policy "publish_import" (surely also "delete_import") only starts when the configured hotfolder is not empty [was #23988]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/17 | Ghost User | 2021-11-25

If the hotfolder (e.g. ./folders/hotfolder) is empty, the "publish_import" and "delete_import" policy workflows do not start at all! As soon as a single file is present in the hotfolder, the workflow runs (the file may even be empty!). This is not the intended behaviour and should be fixed. As a workaround, simply copy a file into the hotfolder or create/save one there.

Issue #19: unable to upload large size file (koLibRI) [was #23537]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/19 | Ghost User | 2021-11-25

…nor is the koLibRI import successful for a 60 GB binary file. org.apache.cxf.interceptor.Fault: Could not write attachments.

Issue #20: No aggregation if only one fileGrp is ingested. [was #20599]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/20 | Ghost User | 2021-11-25

If only one METS file group is ingested, it is not necessary to create a special aggregation during koLibRI ingest.

Issue #24: Enhance the (automatic) mimetype extraction of the DfgViewerMetadataProcessor, and others [was #9462]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/24 | Ghost User | 2021-11-25

Issue #25: PIDs should be re-used if objects were deleted from the sandbox [was #9331]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/25 | Ghost User | 2021-11-25

Issue #26: Aggregation-Import must ensure creation of a root-edition or collection [was #12181]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/26 | Ghost User | 2022-09-23

If importing using the aggregation_import policy, it must be ensured that a root edition or collection (a) already exists or (b) is forced by the import module!

Issue #27: TG-import external should provide a minimal METS import profile [was #9459]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/27 | Ghost User | 2021-11-25

The METS import currently creates lots of aggregations, reassembling the internal METS structure. It would be nice to have the possibility to choose a minimized import, which just creates an edition and only one aggregation of the selected images (e.g. MAX) in an image folder, e.g.:
/TheMetsEdidition/
  theMetsFile.xml
  images/
    Vorderdeckel-MAX.jpg
    ...
    Vita-Alexandri-Magni-Blatt21r.jpg
    ...
    Rückdeckel-MAX.jpg

Issue #29: Check RDF metadata at re-import using TG-import. Do we have to delete e.g. the image metadata so as not to have two of them? [was #14150]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/29 | Ghost User | 2021-11-25

Issue #30: Make it possible to use URLs for DFG-Viewer METS import [was #9188]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/30 | Ghost User | 2021-11-25

Issue #56: Attributes sorted alphabetically after import [was #9288]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/56 | Ghost User | 2021-11-25

Issue #57: Link Rewriter should rewrite URLs in <?xsl-stylesheet?> processing instruction [was #9438]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/57 | Ghost User | 2021-11-25

Issue #58: Ingest of metadata from databases [was #9130]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/58 | Ghost User | 2021-11-25

Issue #99: Editions inside collections should be metadata checked [was #8919]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/99 | Ghost User | 2021-11-25

Editions inside collections should be metadata checked (I think).

Issue #106: TG-publish must check if a given work (isEditionOf relation) really exists. [was #8976]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/106 | Ghost User | 2021-11-25

Issue #107: Cannot publish more than one TGO at once [was #8996]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/107 | Ghost User | 2021-11-25

Issue #111: TG-lab GUI (and TG-publish) should use nearlyPublish [was #9075]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/111 | Ghost User | 2021-11-25

Issue #116: Make the rewriting configurable in any way... [was #9145]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/116 | Ghost User | 2022-02-14

Issue #120: Let the TG-lab publish into the Sandbox, not directly anymore. [was #9150]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/120 | Ghost User | 2021-11-25

We need to publish into the Sandbox from the TG-lab, so the user can publish and then view the published objects in the sandbox browser (e.g. by using the Eclipse browser and directly showing the browseable edition). Then the user can press REALLY publish in the publish GUI. So: first PROOF, then PUBLISH (by default into the sandbox), and finally FINALLY PUBLISH. Would that be fine?
Please comment this issue!

Issue #128: TG-publish must look for another way to check the references in XML files of objects to publish [was #9460]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/128 | Ghost User | 2021-11-25

Issue #134: TG-publish must cope with Runtime-Exceptions in the Publish-Response [was #11882]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/134 | Ghost User | 2021-11-25

If runtime exceptions occur in action modules (like an org.elasticsearch.common.netty.channel.channelexception), ERROR must be set in the Publish-Response!

Issue #136: Handle dependencies of policy modules [was #16469]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/136 | Ghost User | 2021-11-25

Such as "Rewrite PIDS in Mets = true" when no PIDs are generated at all...

Issue #137: Published data does not appear in the repository [was #18770]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/137 | Ghost User | 2021-11-25

Reported by Michael Bender (TU Darmstadt). The following steps lead to the problem: 1. Published an edition with work and XSLT. 2. Message: Success! What happened: the data cannot be found in the repository. What I expected instead: the data can indeed be found in the repository!

Issue #144: Timeout during publication [was #34059]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/144 | Ghost User | 2021-11-25

...during execution of the UpdateTGAuth module. During the publication of a collection with three sub-collections (in different projects) a timeout occurred, so that all objects were moved into static storage, but not all of them received an isPublic flag in TG-auth in the following module.

Issue #146: Implement resumption of a failed publication process [was #34061]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/146 | Ghost User | 2021-11-25

Issue #149: Implement DH-import [was #21546]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/149 | dh-import-import | 2021-11-30

Issue #150: Test the import API with data from the Kenom project [was #26618]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/150 | dh-import-import | 2022-09-16

Pick a collection with quite a lot of objects, with which we can then test the Publish API.
Issue #151: Implement Publish API [was #26613]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/151 | dh-import-import | 2021-11-30

Issue #152: Documentation of DH-publish API [was #26614]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/152 | dh-import-import | 2021-11-30

We need to know:
- how API calls are working
- how the RDF file has to be structured
- mandatory metadata
- add examples
- ...

Issue #166: When publishing hangs, a Cancel button is missing! [was #18743]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/166 | dh-import-import | 2021-11-30

Issue #168: Tokens for DH-publish [was #21544]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/168 | dh-import-import | 2021-11-30

Tokens shall be used for the import API. We take the token from the Publikator and make it revealable from there! We then test in DH-publish whether storing the data will succeed. If yes, the user is allowed to publish; if not, not.

Issue #175: For publish progress: take file sizes into account [was #22489]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/175 | dh-import-import | 2021-11-30

Three 700 MB files sit at 76% for a looong time; the progress should be increased and written into the PublishResponse during the DH-crud upload call…

Issue #181: Re-use information from failed publication runs [was #22702]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/181 | dh-import-import | 2021-11-30

If, for example, PIDs and/or DOIs were already created and an error occurred AFTER their creation, they should be re-used in the next publish run. For this, the information could be stored in a file in the OwnStorage.
Issue #182: Check if a collection already has been created in case of an error while publishing [was #13688]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/182 | dh-import-import | 2021-11-30

Issue #185: Publish status toggles from error to draft [was #23009]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/185 | dh-import-import | 2021-11-30

Create a new collection in Publikator, update one file, now publish the collection (don't add any metadata fields). It will fail because of missing metadata. Try to publish this collection 6-10 times; suddenly Publish reports as status:
{"status":"ERROR","uri":"https://de.dariah.eu/storage/EAEA0-974C-2022-F73D-0","module":"de.langzeitarchivierung.kolibri.actionmodule.dariahde.publish.ReadCollectionModels","progress":16}
{"status":"DRAFT","uri":"https://de.dariah.eu/storage/EAEA0-974C-2022-F73D-0","module":"de.langzeitarchivierung.kolibri.actionmodule.dariahde.publish.ReadCollectionModels","progress":16}
The calls follow within a few milliseconds. I am expecting that it stays on ERROR.

Issue #188: DH-publish service robustness for API calls [was #26615]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/188 | dh-import-import | 2021-11-30

For now only Publikator can send RDF files to DH-publish. If everyone can, we must ensure the robustness of the DH-publish service to cope with invalid RDF files.

Issue #195: logging wrong on data queue param… [was #34362]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/195 | dh-import-import | 2021-11-30

de.langzeitarchivierung.kolibri.processstarter.dariahde.publish.DHPublish.run() [2020-12-10T14:09:55.360+01:00] [INFO] DH-publish#PUBLISH started [9.4.7-DH-SNAPSHOT+202012071337] [INFO] Checking #PUBLISH data queue every less than a millisecond

Issue #199: Put Tika extracted mimetype into ADMMD dcterms:format (I suppose) [was #35617]
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/199 | dh-import-import | 2021-11-30

We would thus be independent of proprietary mimetypes like "application/vnd.openxmlformats-officedocument.wordprocessingml.document", as given by some operating systems for doc/rtf/word files.

Issue #200: Publishing from nonpublic rep should create new revision automatically
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/200 | Ubbo Veentjer | 2023-02-14

It would be great if tgpublish could create new nonpublic revisions automatically when publishing. This would be a lot less troublesome and confusing to users. Best would be if new revisions could even be created before running a link-rewriter.

Issue #203: TG-Import: aggregation import does not put items into aggregations.
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/203 | Mathias Goebel | 2024-03-01

When uploading a couple of thousand documents in several subdirectories, not all items are put into the aggregations.

Issue #206: Document portalconfig and readme.md features also in TG-publish doc
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/206 | Stefan E. Funk | 2022-10-11

Issue #210: Check for projectmanager right BEFORE releasing NearlyPublished flag in TG-publish
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/210 | Stefan E. Funk | 2023-06-09 (Publish Import Improved)

At least the first object can be nearlyPublished in ES and RDFDB if called from TG-import. A test for project manager rights should be implemented before deleting the flag in ES and RDFDB.

Issue #211: Implement static ElasticSearch Client for TG-publish
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/211 | Stefan E. Funk | 2023-06-09 (Publish Import Improved)

See /kolibri-tgpublish-service/src/main/java/de/langzeitarchivierung/kolibri/actionmodule/textgrid/publish/CopyElasticSearchIndex.java and /kolibri-tgpublish-service/src/main/java/de/langzeitarchivierung/kolibri/actionmodule/textgrid/publish/ReleaseNearlyPublishedRelation.java

Issue #213: Get PID-Service client while publish is running, not at Service start
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/213 | Stefan E. Funk | 2022-10-11

Otherwise Publish has to be restarted if the PID service happened to be down when Publish started.

Issue #216: tgrep landing pages: portalconfig and readme.md are implemented as worldReadables - this is confusing
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/216 | Ubbo Veentjer | 2022-10-11

We should discuss whether this is really necessary, or whether we could also allow readme.md and portalconfig.md without a parent collection in the normal publication process. You don't see this in the lab, but you get confused using the API. So at least we need to document this behaviour if we don't fix it.
Related to #214.

Issue #223: Check if validation of import can be improved
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/223 | Stefan E. Funk | 2023-01-23

We validate metadata files in module publishCheck, but we do not have `<generated>` metadata yet. So we can only ignore validation errors concerning this element. Therefore checks such as XML ID checks are not made, and false IDs are not detected. We have to see that we can validate correctly without having the `<generated>` element. It is generated by TG-crud during the #CREATE operation and given back from TG-crud.

Issue #225: Check publication process with “already-in-progress“ status
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/225 | Stefan E. Funk | 2023-02-21

Test what happens when a second publication process is triggered for the same collection while the first publication is still running (is that even possible in the Publikator? What happens via the API?). In this case
- the publication process must not be aborted (!), and
- the error message (status) should be transported transparently to the user.

Issue #235: Check dry-run for not needed PID creation or publish checks, etc
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/235 | Stefan E. Funk | 2023-06-09

Issue #236: Make publish-import workflow more robust
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/236 | Stefan E. Funk | 2023-06-09 (Publish Import Improved)

Such as: Maybe do not check for nearly published-ness, just do it all (again)?

Issue #237: Create new Process Starter for better “sandbox publishing“
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/237 | Stefan E. Funk | 2023-06-09 (Publish Import Improved; Assignee: Stefan E. Funk)

Issue #239: Error status must be renewed after re-run of publication
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/239 | Stefan E. Funk | 2023-11-09

Please try:
1. Create a new collection in the Publikator, add mandatory metadata
2. Create a new sub collection, add mandatory metadata
3. Publish the collection
4. ERROR: Empty collection
5. Add a file to the sub collection
6. Publish again
7. The ERROR persists!

Please have a look into the DH-publish service! (Assignee: Stefan E. Funk)

Issue #245: Hotfix main branch with #244
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/245 | Stefan E. Funk | 2024-03-07 (Assignee: Stefan E. Funk)

Issue #246: Hotfix main branch with #243
https://gitlab.gwdg.de/dariah-de/kopal-library-of-retrieval-and-ingest/-/issues/246 | Stefan E. Funk | 2024-03-12 (Assignee: Stefan E. Funk)