
3. The Multi Interface

The ``multi'' interface permits transfers to take place in an asynchronous manner. It consists of the functions:

    Curl_Multi_Type curl_multi_new ();
    curl_multi_add_handle (Curl_Multi_Type mobj, Curl_Type obj);
    curl_multi_remove_handle (Curl_Multi_Type mobj, Curl_Type obj);
    curl_multi_close (Curl_Multi_Type mobj);
    Int_Type curl_multi_length (Curl_Multi_Type mobj);
    Int_Type curl_multi_perform (Curl_Multi_Type mobj [,Double_Type timeout]);
    Curl_Type curl_multi_info_read (Curl_Multi_Type mobj [,Ref_Type status]);

A Curl_Multi_Type object is essentially a collection of Curl_Type objects. As such, one cannot understand the multi-interface without first understanding the easy-interface.

As the name suggests, the curl_multi_new function creates an instance of a Curl_Multi_Type object. The curl_multi_add_handle function is used to add a Curl_Type object to the specified Curl_Multi_Type object. Similarly, the curl_multi_remove_handle function is used to remove a Curl_Type object from it. Although the module automatically destroys the underlying Curl_Multi_Type object when it goes out of scope, the curl_multi_close function may be used to perform this operation explicitly.
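
For example, the basic life-cycle of a Curl_Multi_Type object might look something like the following sketch, where the URL is simply a placeholder:

    m = curl_multi_new ();            % create an empty Curl_Multi_Type object
    c = curl_new ("http://a.fictional.domain/index.html");
    curl_multi_add_handle (m, c);     % add the Curl_Type object to m
    % ... transfers are carried out here via curl_multi_perform ...
    curl_multi_remove_handle (m, c);  % remove it once its transfer has completed
    curl_multi_close (m);             % optional: m is destroyed automatically
                                      %  when it goes out of scope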

The curl_multi_perform function is used to carry out the actions of the Curl_Type objects associated with the Curl_Multi_Type object. The curl_multi_info_read function may be used to obtain the results of the transfers carried out by curl_multi_perform.

To illustrate the use of these functions, consider once again the last example of the previous section involving the processing of a list of URLs. That example made use of a callback function called write_callback, which is needed here as well.
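
A minimal sketch of such a callback, assuming its job is simply to write each chunk of data to the file pointer that was passed to curl_setopt, is:

    private define write_callback (fp, data)
    {
       % fwrite returns the number of bytes written, or -1 upon error
       return fwrite (data, fp);
    }

With that in place, here is the example once again, rewritten to use the ``multi'' interface: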

{
   urls = {"http://servantes.fictional.domain/don/quixote.html",
           "http://servantes.fictional.domain/sancho/panza.html"};
   fp_list = Assoc_Type[];
   m = curl_multi_new ();
   foreach url (urls)
     {
        file = path_basename (url);
        fp = fopen (file, "w");
        fp_list [url] = fp;
        c = curl_new (url);
        curl_setopt (c, CURLOPT_WRITEFUNCTION, &write_callback, fp);
        curl_multi_add_handle (m, c);
     }

   dt = 5.0;
   while (last_n = curl_multi_length (m), last_n > 0)
     {
        n = curl_multi_perform (m, dt);
        if (n == last_n)
          continue;
        while (c = curl_multi_info_read (m, &status), c != NULL)
          {
             curl_multi_remove_handle (m, c);
             url = curl_get_url (c);
             () = fclose (fp_list[url]);
             if (status == 0)
               vmessage ("Retrieved %s", url);
             else 
               vmessage ("Unable to retrieve %s", url);
          }
     }
}

The above code fragment consists of two stages. The first stage involves the creation of individual Curl_Type objects via curl_new and the population of the Curl_Multi_Type object assigned to the variable m using the curl_multi_add_handle function. Also during this stage, files are opened and the resulting file pointers are placed in an associative array for later use.

The second stage involves a nested ``while'' loop. The outer loop will continue to run as long as there are still Curl_Type objects contained in m. The curl_multi_length function returns the number of such objects. Each time through the loop, the curl_multi_perform function is called with a time-out value of 5.0 seconds. This means that the function will wait up to 5.0 seconds for input on one of the underlying curl objects before returning. It returns the number of such objects that are still active. If that number is less than the number of objects contained in the multi-type object, then at least one of them has finished processing.

The inner loop of the second stage will execute only if the transfer of at least one object has completed. This loop repeatedly calls the curl_multi_info_read function to obtain a completed Curl_Type object. In the body of the loop, the object is removed from the multi-type object and the file associated with its URL is closed. The processing status, which is returned by curl_multi_info_read through its reference argument, is also checked at this time.

Although this seems like a lot of complexity compared to the ``easy'' approach taken earlier, the reward is greater. Since the transfers are performed asynchronously, the time spent downloading the entire list of URLs can be a fraction of that of the synchronous approach.

