I created a schedule for scrubbing my newly created pool, and this morning I decided to check the status of the task with the command zpool status. Do I understand correctly that the task was, for some reason, run against the boot pool and not the data pool? What am I doing wrong, and how can I fix it?
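
For reference: zpool scrub requires an explicit pool name and zpool status accepts one, so a scheduled job can be pointed at the data pool directly rather than at the boot pool. A minimal sketch in Python, assuming a hypothetical data pool named "tank" and that the zpool command-line tool is available:

    import subprocess

    DATA_POOL = "tank"  # hypothetical name; replace with the actual data pool (see zpool list)

    def scrub(pool):
        # Start a scrub on the named pool, equivalent to: zpool scrub <pool>
        subprocess.run(["zpool", "scrub", pool], check=True)

    def status(pool):
        # Return zpool status output for that pool only, not the boot pool
        result = subprocess.run(["zpool", "status", pool],
                                check=True, capture_output=True, text=True)
        return result.stdout

    if __name__ == "__main__":
        scrub(DATA_POOL)          # typically requires root privileges
        print(status(DATA_POOL))
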
👍 13
👤 u/celgus_
📅 Apr 01 2021
I'm too sleep-deprived from scrubbing the file received from finance to think of a better joke. I never understand why I receive a data dump every year instead of a clear guide on what the hell to pay people.
👍 20
👤 u/bad_armenian_juju
📅 Apr 12 2021
How serious do you think possible data-scrubbing by HFs is to the GME MOASS? (them theoretically compiling a majority of our posted positions)

IMO I don't believe they could get any legitimate data, because wouldn't it be easy to BS our positions and skew the data? I've just seen it brought up multiple times.

👍 2
👤 u/Blacknblue202
📅 Feb 21 2021
Question regarding retained data scrubbing or purging

Hello all,

I was looking for guidance on data retention / anonymization. For PII/PHI under HIPAA, I am having a difficult time finding out how long data should be retained before it is scrubbed or anonymized. Maybe this is because laws like this fall under the purview of each state, if I am not mistaken? Is there a general rule of thumb for the amount of time that data is retained before identifiers are removed, or is the data purged altogether?

👍 5
👤 u/Dahbears
📅 Feb 20 2021
"Google exempts its own websites from Chrome's automatic data-scrubbing feature, allowing the ads giant to potentially track you even when you've told it not to." theregister.com/2020/10/1…
👍 3
👤 u/UnknownCitiz3n
📅 Oct 28 2020
Is it worth enabling data checksum in a shared folder, when data scrubbing is already enabled in the storage pool?

In the "Shared Folder Creation Wizard" there is a check box for "data checksum for advanced data integrity", but I am about to create this share on the storage pool that already has data scrubbing enabled.

👍 3
👤 u/chimp73
📅 Aug 04 2020
Scrubbing personal data on photos?

I'm wondering specifically about uploading pictures onto SA and sending pictures. I know the basics: don't upload anything that can be traced back to you, like pictures in front of your work or school, pictures of your face, etc. However, I know that pictures can carry data about when and where they were taken if you know how to dig it up. I don't know how prevalent privacy issues like this are in the sugar world; maybe I'm paranoid/anxious, but I'm just concerned about my safety from creeps.
What do you do to ensure privacy when you are looking for arrangements?
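
For the embedded when/where data specifically (EXIF, including GPS tags), one common approach is to re-save the image with only its pixel data before uploading. A minimal sketch using the Pillow library, with hypothetical file names:

    from PIL import Image  # Pillow: pip install Pillow

    def strip_metadata(src, dst):
        # Copy only the pixel data into a fresh image, dropping EXIF/GPS and any
        # other embedded metadata, then save the clean copy.
        # Works for common RGB/RGBA JPEG/PNG images; palette images may need extra care.
        with Image.open(src) as img:
            clean = Image.new(img.mode, img.size)
            clean.putdata(list(img.getdata()))
            clean.save(dst)

    strip_metadata("photo.jpg", "photo_clean.jpg")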

👍 7
👤 u/uncertainforecast
📅 May 08 2020
Which product should I choose for Data Scrubbing?

I am currently looking to purchase a Synology NAS to take advantage of data scrubbing. My main motivation is to keep long-term data on it protected from bit rot and the occasional failure of a disk. I do have an off-site backup option, but I'd like the confidence that the data on the local NAS is always correct, and also that the backup won't later be contaminated with incorrect data.

From what I've read, I need at least a recent device that supports Btrfs and RAID 6. I will not be using it for anything other than storing data and sharing it over SMB/NFS. Is the DS418 good enough for this purpose?

👍 3
👤 u/DrFlabbergasted
📅 Jun 17 2020
Data scrubbing- am I doing it right?

I have a DS920+ with about 12GB of mostly videos and photos, and I have been getting checksum errors about once per day, so I thought I'd best do data scrubbing. It's been going for more than 48 hours, maybe 72, and it's only got to 23.01%. Is that normal? The drives are:

  1. Iron Wolf 8TB
  2. Iron Wolf 8TB
  3. Iron Wolf 8TB
  4. Seagate something? 3 TB

I have purchased a new Iron Wolf 8TB, which is ready to install, as I was thinking the checksum errors might be caused by the old Seagate.

👍 2
👤 u/cryingcunt
📅 Aug 22 2020
Data Scrubbing on Array.

I have a DS2413+ with a DX1211 external expansion. In the expansion I have 8 WD 10TB drives and 4 WD 12TB drives from shucked WD Easystores. They are configured in an SHR2 array. That array has been grown over the past years, starting from 2TB drives up through 4s, 8s, 10s and 12s. I really love SHR. With the increase in HD capacity, the array rebuild times have been growing as well. I went to implement stronger data protection on the array by setting a data scrubbing schedule. The data scrubbing has now been running for over a week.

Looking at the storage pool status, it says "Verifying drives in the background (Performing data scrubbing XX.YY%)". Looking at the resource monitor, the drive utilization is at 90% across the entire array, and it does not appear that any one drive is running slower than the others (near-identical read speeds).

The issue I am facing is that the percentage complete is growing by about 1% per day. At this rate, it is going to take 100 days to scrub the entire array. Replacing a drive in the array, such as replacing a 10TB with a 12TB, takes about 8 days.

Questions:

Is there a reason data scrubbing takes so much longer than an array rebuild?

Given the speed of data scrubbing, what schedule should it be on?

👍 2
👤 u/lethalox
📅 Apr 14 2020
data scrubbing schedule?

Hello again,

DS218+, 2x 4TB WD Red, RAID 1, btrfs, 1 single volume, 85% filled (lately I have a lot of dynamic data, so usage goes up and down a lot right now)

Anyways, I thought I might add a data scrubbing schedule, for the lulz... erm, I mean, for good old maintenance ;)

But... I have no clue how often I should run this. Every week? Every 4 weeks? Every 6 months?

Any suggestions would be welcome :)

👍 6
👤 u/Phoenix_C64
📅 Jan 08 2020
Recommendations for scrubbing data from one table w/ a lot of redundant info into a smaller summary table.

I have a data logger that pulls in lab data. Samples are only taken every 10 hours, but the logger keeps recording time points in between and simply repeats the last sample value until the next sample is taken, so it looks something like the first table below.

What are some recommended ways I can scrub/pull that data into a new table so that I only have the first instance and first time point of each new lab sample (such as the table shown below)?

Logger output (raw):

    hour    lab sample
    2       4
    4       4
    6       4
    8       4
    10      10
    12      10
    14      10
    16      10
    18      10
    20      20
    22      20
    24      20
    26      20
    28      20
    30      50
    32      50

Desired summary (first time point of each new sample):

    hour    lab sample
    2       4
    10      10
    20      20
    30      50
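
One way to collapse the raw logger table above into the summary table is to keep only the rows where the sample value changes from the previous row. A minimal sketch using pandas, with the data hard-coded for illustration (caveat: if two consecutive, genuinely different samples happened to share the same value, the second one would be dropped too):

    import pandas as pd

    # Raw logger output: the sample value repeats until a new sample is taken.
    raw = pd.DataFrame({
        "hour":       [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32],
        "lab_sample": [4, 4, 4, 4, 10, 10, 10, 10, 10, 20, 20, 20, 20, 20, 50, 50],
    })

    # Keep only the first row of each run of identical sample values.
    summary = raw[raw["lab_sample"].ne(raw["lab_sample"].shift())]
    print(summary)
    # -> rows (hour, lab_sample): (2, 4), (10, 10), (20, 20), (30, 50)
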
👍 4
👤 u/GhostGlacier
📅 Jan 26 2020
Bad sectors - should I run Data Scrubbing before replacing the drive?

So I have a 4-bay Synology with 4x 2TB drives in it. One of the drives has now developed bad sectors (1 bad sector during last month's check, 36 bad sectors now) and I want to replace it.

My question is this: should I run Data Scrubbing before I replace the drive? The setup is 7 years old and data scrubbing has never been performed, so I think there is a solid chance that it would find and repair some bit rot. But if I replace the failing drive now, won't the system lose the capability to repair bit rot after it rebuilds the RAID from the remaining 3 drives? I am not sure whether the 1-disk redundancy relies on the same redundant data as data scrubbing - if it does, I guess it would be better to run the data scrubbing now, with the failing drive still in place, and then replace the drive once all the bit rot has been fixed?

Thanks for your input

👍 3
👤 u/PanTovarnik
📅 Aug 04 2019
[TASK] Scrubbing 88 pages of data for entry into Excel or Google Spreadsheet.

I have a one-off project I need completed. It's basically scrubbing data out of a list and entering it into an Excel or Google Spreadsheet.

The actual PDF has tons of unneeded info, and I would only need the four columns in the middle of it: number, street name, street type, and zip code.

EX: 123 MAIN ST 99999

I have 88 pages like this, ~20 entries per page. I need to know:

  1. Time frame to complete
  2. Your fee

*DO NOT DM ME. If you cannot follow simple instructions, you will be ignored.
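
For reference, the column extraction described above can also be scripted rather than keyed in by hand. A minimal sketch, assuming the PDF text has already been exported to a plain-text file with one address per line; the file names and the exact pattern are illustrative:

    import csv
    import re

    # Matches lines like "123 MAIN ST 99999": number, street name, street type, ZIP.
    LINE = re.compile(r"^(\d+)\s+(.+?)\s+(\w+)\s+(\d{5})$")

    with open("addresses.txt") as src, open("addresses.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["number", "street_name", "street_type", "zip"])
        for line in src:
            match = LINE.match(line.strip())
            if match:
                writer.writerow(match.groups())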

👍 4
👤 u/kodiak570
📅 May 12 2019
How often should I do data scrubbing?

Hey guys, I am a pretty satisfied owner of a DS216+ II with 2 disks using SHR (essentially RAID 1). My main purpose for the NAS is just local storage of all my data (I use C2 for encrypted off-site backups, and I am pretty happy with it).

My system clearly has no support for RAID scrubbing, but it does support 'data scrubbing', which I have never run. It's not on a schedule or anything.

Should I run it? How often?

👍 7
👤 u/Heliosmaster
📅 May 18 2019
Data scrubbing needed for disk redundancy?

So I have had a DS414 up and running for a couple of years now with 4 HDDs in an SHR RAID type. Today I noticed that my System Health is abnormal, and that my first drive has a "warning" status under "Health status" and a bad sector count of 7 so far.
Anyway, while browsing the Synology OS I came across the data scrubbing tab and noticed that data scrubbing is something I have never done (because of a lack of knowledge).

So my question is: is data scrubbing necessary for disk redundancy? Can I buy a new drive tomorrow and swap it with the failing drive, without ever having done any data scrubbing, and keep my data intact? Or am I out of luck?

👍 6
👤 u/Lovelime
📅 Oct 11 2019
Extended SMART test still necessary with btrfs data scrubbing?

I'm running a DS1819+ with 4x4TB HDDs in SHR2 mode and btrfs (with advanced data integrity) as the file system. On my previous unit with ext4 I ran a weekly quick SMART test and a monthly extended SMART test on the HDDs.

Now with the new unit I scheduled a monthly data scrub in addition to the SMART tests, but I'm wondering if the data scrub makes the extended SMART test superfluous.

👍 2
👤 u/tangofan
📅 Jun 29 2019
Synology DS1819+ - Reboots During Data Scrubbing

Hello,

I have a pretty new DS1819+ that runs perfectly 24/7, except that during data scrubs the system crashes at a certain point in the process, around 40% both times it's happened. When this happens all the lights go out and it appears to 'soft' reboot, coming back up without me having to power it back on or anything like that. It then optimizes the file system and continues scrubbing as if nothing happened.

I'm running DSM 6.2.1-23824 Update 6 and I'm wondering how to go about troubleshooting this. I'm thinking it may be a certain disk causing the problem, as I have two different volumes, one with SSDs and one with HDDs. The SSD volume scrubs fine. I've rebuilt the RAID with the HDDs with no problem, after copying over some data and then adding some more drives. The logs don't show me anything useful other than that the system booted from an improper shutdown.

Any ideas how I should proceed? Would support be able to get some better logs if I contacted them?

Thanks for the help.

👍 3
👤 u/waterbed87
📅 Apr 01 2019
Hey reddit... my wife just stayed up with me until 4:17 a.m. -- SCRUBBING DATA and TESTING SOFTWARE that I wrote. She deserves to see this when she wakes up.

Only you guys could truly appreciate what an amazing show of love this is!

I have a big calendar system I'm deploying tonight, and I needed to import all the old data into it. Bless her heart, she whipped out Excel and scrubbed those tables for HOURS.

I adore her!

EDIT: Holy $h!tballs!! I posted this while on the verge of hallucinating last night, and here it is on the front page!

To all those who say "Hey, jerk, Reddit isn't for these personal posts," I say, "Sorry. I truly didn't think this was going anywhere. Truly!"

To all the rest of you, thanks for making her smile. I haven't figured out what I'm going to do for her yet, but it's definitely going to be nice. We -are- going on a trip to Mexico in a few weeks... maybe some possibilities there.

Oh yeah... and she woke me up at 1:00 this afternoon with blueberry muffins and carrot juice. Seriously? I don't deserve her.

(EDIT: "Fresh home-made blueberry ginger muffins, thank you!" She didn't want to be misrepresented as someone who uses box mixes.)

👍 727
👤 u/tynman
📅 Dec 18 2010
[Spoiler] Data scrubbing

https://preview.redd.it/jxncaijjmh931.jpg?width=744&format=pjpg&auto=webp&s=add6f30353f99927b422ba38c275e0a75f996d4b

👍 5
👤 u/pavonharten
📅 Jul 10 2019
