
🎓 Purpose

Data integrity is essential to research, regardless of data classification: unauthorized modifications to your data can disrupt a project, produce incorrect results, undermine trust in academic research, and damage your reputation. It is therefore important that the integrity of your data remains verifiable throughout the research workflow and data lifecycle.

👥 Audience

Researchers | IT staff


❓ Initial considerations

Establish a resilient backup strategy and back up your data.

  • Should unauthorized modifications take place, it is important to have a “clean” (unaffected) backup to revert to.

Consult departmental and divisional resources and recommendations.


📘 What can I do?

Compute and verify checksums for your essential data files.

Keep the generated checksums secure and separate from your data files; if both a file and its recorded checksum can be modified, the checksum no longer offers any protection.

  • Cryptographic hash functions produce a digest that is, for all practical purposes, unique to the input. Changing even a single bit of a file produces a radically different digest, so tampering can be detected even when the file otherwise appears unmodified. The platform-specific examples below show how to compute and verify checksums from the command line.

Windows
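
A minimal sketch using the built-in PowerShell cmdlet Get-FileHash; the file name mydata.csv is a placeholder for your own file:

    Get-FileHash -Algorithm SHA256 .\mydata.csv   # prints the SHA-256 digest of the file

From the classic Command Prompt, certutil -hashfile .\mydata.csv SHA256 produces the same digest.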

macOS
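
A minimal sketch using the shasum utility bundled with macOS; mydata.csv and mydata.csv.sha256 are placeholder names:

    shasum -a 256 mydata.csv > mydata.csv.sha256   # compute a SHA-256 checksum and save it
    shasum -c mydata.csv.sha256                    # later: recompute and compare, reports OK or FAILED

Store the .sha256 file somewhere the data itself cannot be modified from, in line with the advice above.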

Linux
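
A minimal sketch using the GNU coreutils sha256sum command; mydata.csv and checksums.sha256 are placeholder names:

    sha256sum mydata.csv > checksums.sha256   # compute a SHA-256 checksum and save it
    sha256sum -c checksums.sha256             # later: recompute and compare, reports OK or FAILED

Changing even one byte of mydata.csv will make the second command report FAILED.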

Maintain version control.
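
As one possible approach, a Git repository stores a cryptographic hash of every committed file, so unexpected modifications are easy to detect; the file name mydata.csv is a placeholder:

    git init                               # create a repository in the current directory
    git add mydata.csv
    git commit -m "Add initial data file"  # record the file and its content hash
    git status                             # later: reports whether the file has changed
    git diff                               # shows exactly what changed since the last commit

Git works best for text-based or modestly sized files; very large binary datasets may call for a dedicated data-versioning tool.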

