grep -i -A 5 -B 5 'db_pd.Clients' eightygigsfile.sql — this has been running for an hour on a fairly powerful Linux server which is otherwise not overloaded. Is there any alternative to grep? Is there anything about my syntax that can be improved (would egrep or fgrep be better)? The file is actually in a directory shared via a mount with another server, but the actual disk space is local, so that shouldn't make any difference.
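A minimal sketch of a commonly faster invocation, not a definitive answer: forcing the C locale and treating the pattern as a fixed string with grep -F avoids regex and multibyte-locale overhead. Note this drops the -i from the original command, so it only applies if case-insensitive matching is not actually needed.

    LC_ALL=C grep -F -A 5 -B 5 'db_pd.Clients' eightygigsfile.sql

If case-insensitivity is required, keeping -i but still setting LC_ALL=C can help on its own.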
Is it possible to use grep on a continuous stream? What I mean is something like tail -f <file>, but with grep on the output, in order to keep only the lines that interest me. I've tried tail -f <file> | grep pattern, but it seems that grep can only be executed once tail finishes, that is to say never.
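A minimal sketch, assuming GNU grep: when grep's stdout is not a terminal it block-buffers its output, which makes the pipe look stalled; --line-buffered makes it flush after every matching line. The file name /var/log/app.log is a placeholder.

    tail -f /var/log/app.log | grep --line-buffered 'pattern'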
I often need to kill a process during programming. The way I do it now is:

    [~]$ ps aux | grep 'python csp_build.py'
    user 5124 1.0 0.3 214588 13852 pts/4 Sl+ 11:19 0:00 python csp_build.py
    user 5373 0.0 0.0 8096 960 pts/6 S+ 11:20 0:00 grep python csp_build.py
    [~]$ kill 5124

How can I extract the process id automatically and kill it in the same line? Like this: [~]$ ps aux | grep 'python csp_build.
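A minimal sketch of two common one-liners, offered as assumptions rather than the thread's accepted answer: pkill -f matches against the full command line, and the [p]ython bracket trick keeps the grep process itself out of the ps output.

    pkill -f 'python csp_build.py'
    # or, staying closer to the original ps | grep pipeline:
    ps aux | grep '[p]ython csp_build.py' | awk '{print $2}' | xargs -r kill

The awk step picks out the PID column from ps aux; xargs -r (a GNU extension) skips running kill when nothing matched.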