If you want to search and replace text across lots of files, Perl can make your life a lot simpler. Here is a script that replaces the string 'old_text' with 'new_text' in every file ending in .txt. The script has to be run from the same directory as the files containing the text you want replaced.
#!/usr/bin/perl
@files = <*.txt>;
foreach $file ( @files ) {
    system( "perl -p -i -e 's/old_text/new_text/g' \"$file\"" );
}
To use this script, copy it into a file and save it, then make the file executable with the following command.
chmod 755 <filename>
If you have the GNU version of sed, it's a lot easier to just
sed -i 's/old_text/new_text/g' *.txt
That’s a good point.
I tend to use a perl script when I need regular expressions to match and modify the strings, rather than simple text-to-text replacement.
Also, you can use a one-liner in perl with:
perl -p -i -e 's/old_text/new_text/g;' *.txt
It seems odd to me for a perl script to make a system call that invokes perl again. There are other ways to replace the text, but since Perl is very flexible, this works too.
A little script that goes around replacing text in a large number of files? I for one can't think of anything that could go hideously, hideously wrong with that? :-)
It makes life a whole lot easier when the alternative is doing it by hand.
Horribly dangerous, but I’ve used a variation of that myself:
perl -p -i -e 's/foo/bar/gi' `find . -name '*.html' -print`
One thing you can do to lessen the chance of making a horrid mistake is to add an extension to the -i option (like .bak). That way at least you maintain the original and can diff the two files to see what got changed.