The problem: tons of log files with multiple columns and thousands of rows, and you only need two or three of those columns.
So what do you do? Open every single file, select the columns you want, and copy and paste them into a new spreadsheet? No.
The solution: SQL, or more specifically Log Parser. It comes with many ready-made queries and scripts, and it also lets you use SQL to sift through many different types of files. Don't know SQL? No problem; it's easy, and the command below will start you off in the right direction (just make sure you select the correct file type in the Log Type option):
- SELECT hostname, ipaddress FROM '\Files\*.csv'
It may ask you to specify a folder when you attempt to run the query. SQL uses three basic clauses to search for data: SELECT, FROM, and WHERE. The command above asks Log Parser to select the hostname and ipaddress columns from the specified files (in this case, every .csv file in the \Files folder) and return the results.
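If you only want certain rows, a WHERE clause narrows the results. As a rough sketch, the query below keeps only rows whose IP address starts with 10. (the ipaddress column and the 10.% pattern are just examples; substitute whatever columns and values your files actually contain):
- SELECT hostname, ipaddress FROM '\Files\*.csv' WHERE ipaddress LIKE '10.%'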
After the query finishes, the results are displayed in the top window, and from there you can export them to a new file. No more opening every single file and hunting for columns; this simple tool and query can be put to many different uses.
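If you'd rather skip the manual export step, Log Parser's SQL also supports an INTO clause that writes the results straight to a file. A minimal sketch, assuming CSV output and using results.csv as a placeholder name:
- SELECT hostname, ipaddress INTO results.csv FROM '\Files\*.csv'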