Data integrity
Purpose
Data integrity is essential to research, regardless of data classification: unauthorized modifications to your data can disrupt a project, produce incorrect results, undermine trust in academic research, and damage your reputation. It is important that the integrity of your data is verifiable throughout the research workflow and lifecycle.
Audience
Researchers
IT staff
Initial considerations
Establish a resilient backup strategy and back up your data.
Should unauthorized modifications take place, it is important that you have a "clean" (unaffected) backup to revert to.
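As a minimal sketch of this idea, with hypothetical local paths, keeping a dated, read-only snapshot alongside a mirrored copy ensures a clean version survives later changes (a real strategy would also keep an off-site or offline copy):

```shell
# Hypothetical paths; substitute your own data and backup locations.
mkdir -p project-data backups/current
printf 'sample results\n' > project-data/results.csv

# Mirror the working data into a "current" backup
# (rsync -a --delete project-data/ backups/current/ is a common alternative)
cp -a project-data/. backups/current/

# Keep a dated, read-only snapshot so a clean copy survives later changes
cp -a backups/current "backups/snapshot-$(date +%F)"
chmod -R a-w "backups/snapshot-$(date +%F)"
```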
Departmental and divisional resources and recommendations.
Contact your local IT group to find out which supported or recommended file verification and data integrity tools are available to you.
What can I do?
Compute and verify checksums for your critical data files.
Keep the generated checksums secure and separate from your data files; if a checksum can be modified along with the file it describes, it is no longer useful for verification.
Hashing algorithms produce an effectively unique output sequence for every distinct input. This means that if even a single bit of a file has been altered, a radically different output sequence will be produced, even if the file initially appears unmodified.
Windows: File Checksum Integrity Verifier (FCIV)
macOS: shasum command
Linux: sha256sum command
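On Linux, for example, the workflow looks like this (filenames are hypothetical; on macOS use `shasum -a 256`, and on Windows `certutil -hashfile <file> SHA256` is a built-in alternative):

```shell
# Compute a SHA-256 checksum for a data file (hypothetical filename)
printf 'example research data\n' > results.csv
sha256sum results.csv > results.csv.sha256

# Store the .sha256 file somewhere separate from the data; later, verify:
sha256sum -c results.csv.sha256          # prints "results.csv: OK"

# Any modification, even a single character, makes verification fail
printf 'Example research data\n' > results.csv
sha256sum -c results.csv.sha256 || echo "integrity check failed"
```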
Maintain version control.
Practicing manual version control, or leveraging a version control system, allows you to properly manage and retain the many revisions a file may go through over the course of a task or project.
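For example, a basic Git workflow (the file names here are hypothetical) records each revision so that earlier states can always be recovered:

```shell
# Minimal Git workflow; assumes git is installed.
mkdir analysis && cd analysis
git init -q
git config user.name "Researcher"              # identity needed for commits
git config user.email "researcher@example.com"

printf 'threshold = 0.05\n' > params.txt
git add params.txt
git commit -q -m "Record initial analysis parameters"

# Edit the file; the previous revision is retained, not overwritten
printf 'threshold = 0.01\n' > params.txt
git commit -q -am "Tighten significance threshold"

git log --oneline                 # lists both revisions
git show HEAD~1:params.txt        # prints the original contents
```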
Additional help
General
Contact us | Information Security (IS)
Contact us | Information Technology (IT)
Researchers
https://security.utoronto.ca/services/research-information-security-program/
Related articles