Putting it all together: main()

Code the main function and all auxiliary functions in the files wet.h (declarations) and wet.cxx (definitions). The program should read a URL and the desired recursion depth limit from the command line, then recursively download that URL and all meaningful sub-links up to the given depth limit, storing pages and links as the vertices and arcs of a digraph, which is finally printed out in GraphViz format.
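For concreteness, a minimal sketch of such a main() might look as follows. The names Digraph, crawl() and printGraphviz() are placeholders for whatever digraph class and recursive downloading function you wrote in the previous sections; they are assumptions of this sketch, not a prescribed interface.

#include <cstdlib>
#include <iostream>
#include <string>
#include "wet.h"                        // assumed to declare the placeholder Digraph and crawl()

int main(int argc, char** argv) {
  if (argc < 3) {
    std::cerr << "usage: " << argv[0] << " <URL> <depth>" << std::endl;
    return 1;
  }
  std::string startUrl = argv[1];       // URL where the exploration starts
  int depthLimit = std::atoi(argv[2]);  // recursion depth limit
  Digraph g;                            // placeholder: vertices = pages, arcs = links
  crawl(startUrl, depthLimit, g);       // placeholder: recursive download up to depthLimit
  g.printGraphviz(std::cout);           // write the digraph in GraphViz (dot) format
  return 0;
}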

Test your WET application by running

./wet http://www.enseignement.polytechnique.fr/profs/informatique/Leo.Liberti/test.html 4

and verifying that the output is

# graphviz output by WET (L. Liberti 2006)
digraph www_1199989821 {
  0 [ label = "www.enseignement.polytechnique.fr" ];
  1 [ label = "www.enseignement.polytechnique.fr" ];
  2 [ label = "www.enseignement.polytechnique.fr" ];
  3 [ label = "Thu Jan 10 19:30:21 2008", color = red ];
   0 -> 1;
   1 -> 0;
   1 -> 2;
   2 -> 0;
   2 -> 1;
   0 -> 2;
}
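A listing of this shape could be emitted by a routine roughly like the following sketch. Here Digraph, label and arc are hypothetical names, not the course's actual interface, and the sketch omits the extra red timestamp vertex (vertex 3 above), which would need an additional per-vertex attribute string.

#include <ctime>
#include <iostream>
#include <map>
#include <set>
#include <string>
#include <utility>

struct Digraph {
  std::map<int, std::string> label;    // vertex index -> site name
  std::set<std::pair<int,int> > arc;   // directed arcs (tail, head)

  void printGraphviz(std::ostream& out) const {
    out << "# graphviz output by WET (L. Liberti 2006)" << std::endl;
    // name the digraph after the current Unix timestamp, as in the sample output
    out << "digraph www_" << std::time(0) << " {" << std::endl;
    for (std::map<int, std::string>::const_iterator v = label.begin();
         v != label.end(); ++v) {
      out << "  " << v->first << " [ label = \"" << v->second << "\" ];" << std::endl;
    }
    for (std::set<std::pair<int,int> >::const_iterator a = arc.begin();
         a != arc.end(); ++a) {
      out << "   " << a->first << " -> " << a->second << ";" << std::endl;
    }
    out << "}" << std::endl;
  }
};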
By saving wet's output as wetout.dot and running dot -Tgif -o wetout.gif wetout.dot we get Fig. 5.3, left. Reducing the depth level to 2 yields Fig. 5.3, right.
Figure 5.3: The neighbourhood graphs (depth 4, left; depth 2, right).


