NFT - Anybody here provide (or know of) data scrubbing services? Referral $$$ paid
Would prefer to spend $$$ with fellow CPers.
We have 43 databases averaging 50K each that need to be cleaned up. Seeking a consultant. Referral $$$ paid to your PayPal account. Thanks |
Let me know what kind of services you need. Are you in the KC Metro?
|
I got a gal that comes by when I call and cleans my pipes. She's pretty talented, I'm sure she could clean your databases too.......while wearing a french maid outfit.
|
Shoot me a PM
|
Can we keep your porn?
|
Cleaned up how?
|
|
This thread had potential.... but alas.
|
Better call Saul.
|
If it's really bad, The Wolf.
https://www.youtube.com/embed/IgzFPOMjiC8 |
Gonna need more details. Are you looking at something that could be automated or something that requires a manual once-over?
|
Thank you for your responses.
Here are the details: We are sending emails to lists of insurance agents in all fifty states. The databases are public records.

The first list we need cleaned up is from WisCONsin. (How the natives pronounce it.) http://oci.wi.gov/agentlic/agntlist.shtml Scroll down to the bottom of the page: Format 2 - Agents by Company Appointments. There are 17 files; the first one is al_1st-al.exe.

We figured out how to separate the columns and remove the records without email addresses. Now:

1) We need all the first and last name duplicates removed; and
2) words turned into lower case.

We would prefer a way to automate this whole process. If interested, please PM. Thanks |
1) sort file | uniq > newfile   (note: uniq only drops *adjacent* duplicates, so sort first — or just use sort -u)
2) tr '[:upper:]' '[:lower:]' < file > newfile

(Are you sure you want to remove dupes by name and not just the entire line? If there are two John Smith entries with different info you will lose one. Might be better to remove complete dupe lines instead?) |
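Putting those two steps together — a minimal sketch on made-up comma-separated sample data (the real Wisconsin files are fixed-width, so you'd carve out columns first):

```shell
# sample.csv is made-up data for illustration only
printf '%s\n' 'SMITH,JOHN,jsmith@example.com' \
              'SMITH,JOHN,jsmith@example.com' \
              'DOE,JANE,jdoe@example.com' > sample.csv

# sort first so duplicate lines sit next to each other (uniq only drops
# adjacent duplicates), remove exact duplicate lines, then fold to lower case
sort sample.csv | uniq | tr '[:upper:]' '[:lower:]' > cleaned.csv
```

`sort -u sample.csv` is equivalent to the `sort | uniq` pair and saves a process.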
Quote:
Good catch! Yes. Thank you! |
If you have a bash shell, this will work...
Code:
cut -c101-135,136-160,350-420 infile | tr '[:upper:]' '[:lower:]' | sed 's/ \s\+/,/g' > outfile |
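To automate the whole batch, that pipeline can be wrapped in a loop over the extracted files. A sketch with made-up column offsets and filenames — the real fixed-width layout (the 101-135 etc. ranges above) and the extracted filenames would need to be checked first:

```shell
# tiny fixed-width test file (made-up layout: name in cols 1-15,
# state in cols 16-20, email from col 21 on)
printf '%-15s%-5s%s\n' 'JOHN SMITH' 'WI' 'jsmith@example.com' > agents_01.txt

for f in agents_*.txt; do
    # carve out the fixed-width columns, fold to lower case,
    # and collapse runs of two or more spaces into a single comma
    cut -c1-15,16-20,21-60 "$f" | tr '[:upper:]' '[:lower:]' \
        | sed 's/ \{2,\}/,/g' > "${f%.txt}.csv"
done
```

Matching two-or-more spaces (rather than one-or-more) keeps the single space inside "john smith" intact while still turning the column padding into delimiters.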
fdisk
|
Speak with Stryker — and I'd like to apologize in advance if you've already found a solution.
But Stryker will (at a minimum) be able to guide you to someone. I just don't recall for sure whether he's still in the KC Metro. Good luck to you! |
http://www.midlandhardware.com/thumb...x=350&maxy=350
This should scrub it for you. PM me for paypal fee, :) |
Magnet
|
Couldn't you just use awk instead of cut? awk '{print $N}' with whichever field numbers you need, then do the translation from upper to lower.
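One wrinkle: awk's default field splitting is on runs of whitespace, which misfires when a field ("JOHN SMITH") contains a space. For fixed-width files, substr() is the closer analogue of cut -c, and tolower() replaces the separate tr step. A sketch on made-up offsets:

```shell
# made-up fixed-width sample: name in cols 1-15, email from col 16 on
printf '%-15s%s\n' 'JANE DOE' 'jdoe@example.com' > fixedwidth.txt

# substr($0, start, length) mirrors cut's character ranges;
# sub(/ +$/, ...) trims the field's trailing padding
awk '{
    name = substr($0, 1, 15); sub(/ +$/, "", name)
    print tolower(name) "," tolower(substr($0, 16, 30))
}' fixedwidth.txt
```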
|
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.