It is a shame we have idiots (most likely script kiddies incapable of writing a single line of code themselves) insisting on doing this, but sadly there is no foolproof method of completely preventing the damage. It can be reduced, but with the largest observed attack sizes increasing rapidly (one notable group has managed a 1.2 terabit-per-second attack, when the previous record was only 400 gigabits) there is little that can be done cheaply. My own suggestions: take a look at some freely available DDoS tools and write firewall rules that filter out the traffic they generate, invest in backend link protection (it seems to be a trend to attack those links when the usual attacks fail), and generally invest in servers with more RAM and faster network interfaces to mitigate possible exhaustion attacks.
It is costly, that is true, but it should render most minor attacks from small groups useless, while giving a larger window of time to warn the community of issues.
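To make the firewall-rule suggestion concrete, here is a minimal iptables sketch. The matched string is a placeholder, not a real tool signature (capture actual traffic from the tools you test and substitute what you observe), and the rate-limit thresholds are illustrative only:

```shell
# Drop packets whose payload matches a signature seen from a known
# flooding tool. "EXAMPLE-TOOL-SIGNATURE" is a placeholder -- replace
# it with a string you have actually observed in captured attack traffic.
iptables -A INPUT -p tcp --dport 80 \
    -m string --string "EXAMPLE-TOOL-SIGNATURE" --algo bm \
    -j DROP

# Cheap backstop against small-scale SYN floods: cap new connections
# per source IP (50/second here is an illustrative threshold, tune it
# to your normal traffic).
iptables -A INPUT -p tcp --dport 80 --syn \
    -m hashlimit --hashlimit-name web-syn \
    --hashlimit-above 50/second --hashlimit-mode srcip \
    -j DROP
```

Rules like these only help against unsophisticated floods from known tools; they do nothing for attacks that saturate the link itself, which is where the backend link protection mentioned above comes in.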
09-Feb-2015 16:42:24