excessively long runtime? #96

@franke-biosaxs

Description

Dear all,

I am attempting to convert a ~16 MB database dump (70 tables, the largest of which should have about 7,000 rows) into something sqlite can digest. The script initially produced some output quickly (~40 tables), but in the 19 hours since that first output nothing much has happened. The memory usage of awk (macOS, awk version 20200816) has increased from 14 MB to 16 MB, but there is no other indication of progress. From the output so far, it appears to be stuck in what is presumably the largest table. Whether the cause is something in the data or an issue in the script (exponential runtime on large tables?), I cannot tell.

As my knowledge of awk is limited to knowing of its existence, my question is: does a 19+ hour runtime make any sense for a dump of this size? If not, how can I figure out whether some text/data in my dump is the cause? And if it is a runtime issue, any suggestions on how to debug/improve the script?

To note, the dump was created by an old phpBB forum board, I have no knowledge or control of the dump parameters.
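In case it helps, a rough way I could imagine isolating the suspect table for a timing test (the table name `phpbb_posts` and the file paths are placeholders, not names from my actual dump):

```shell
#!/bin/sh
# Sketch: pull one table's statements out of the dump, then time the
# conversion script on that small subset alone. `phpbb_posts` and the
# /tmp paths are illustrative placeholders.

# Tiny stand-in dump so the example is self-contained.
cat > /tmp/dump_sample.sql <<'EOF'
CREATE TABLE `phpbb_posts` (id INT);
INSERT INTO `phpbb_posts` VALUES (1);
INSERT INTO `phpbb_posts` VALUES (2);
CREATE TABLE `phpbb_users` (id INT);
INSERT INTO `phpbb_users` VALUES (1);
EOF

# Keep only the lines mentioning the suspect table; on the real dump
# this yields a small file that can be fed to the script under `time`.
grep '`phpbb_posts`' /tmp/dump_sample.sql > /tmp/one_table.sql

grep -c 'INSERT' /tmp/one_table.sql   # count of extracted INSERT lines
```

If the script also stalls on that subset, the problem is likely in the table's data; if it finishes quickly, the full-file runtime itself is suspect.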

Thanks

Daniel
