In the example script we implement a basic mapper which maps the text/plain MIME type to an imaginary ontology that extends the FOAF class Document with the properties 'txt:UniqueWords' and 'txt:Chars', where the prefix 'txt:' stands for 'urn:txt:v0.0:'.
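For a document at a hypothetical URL (the host and path below are purely illustrative), the cartridge produces N3 triples of the following shape, typing the resource as a foaf:Document and attaching the two custom properties:

    <http://example.com/summary.txt> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://xmlns.com/foaf/0.1/Document> .
    <http://example.com/summary.txt> <urn:txt:v0.0:UniqueWords> "47" .
    <http://example.com/summary.txt> <urn:txt:v0.0:Chars> "625" .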

use DB;

create procedure DB.DBA.RDF_LOAD_TXT_META (
         in graph_iri varchar,
         in new_origin_uri varchar,
         in dest varchar,
         inout ret_body any,
         inout aq any,
         inout ps any,
         inout ser_key any)
{
  declare words, chars int;
  declare vtb, arr, subj, ses, str any;
  -- if any error occurs we just report that nothing can be done
  declare exit handler for sqlstate '*'
      return 0;
  subj := coalesce (dest, new_origin_uri);
  vtb := vt_batch ();
  chars := length (ret_body);
  -- using the text index procedures we get a list of words
  vt_batch_feed (vtb, ret_body, 1);
  arr := vt_batch_strings_array (vtb);
  -- the array alternates words and position arrays, so we divide by 2
  words := length (arr) / 2;
  ses := string_output ();
  -- we compose the N3 text
  http (sprintf ('<%s> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://xmlns.com/foaf/0.1/Document> .\n', subj), ses);
  http (sprintf ('<%s> <urn:txt:v0.0:UniqueWords> "%d" .\n', subj, words), ses);
  http (sprintf ('<%s> <urn:txt:v0.0:Chars> "%d" .\n', subj, chars), ses);
  str := string_output_string (ses);
  -- we push the N3 text into the local store
  DB.DBA.TTLP (str, new_origin_uri, subj);
  return 1;
};


-- register the cartridge for the text/plain MIME type
insert soft DB.DBA.SYS_RDF_MAPPERS (RM_PATTERN, RM_TYPE, RM_HOOK, RM_KEY, RM_DESCRIPTION)
    VALUES ('(text/plain)', 'MIME', 'DB.DBA.RDF_LOAD_TXT_META', null, 'Text Files (demo)');

-- here we set the order to some large number so we don't break existing mappers
update DB.DBA.SYS_RDF_MAPPERS
   set RM_ID = 1000
 where RM_HOOK = 'DB.DBA.RDF_LOAD_TXT_META';
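After executing the script, registration can be verified with a quick query against the mappers table (a sketch, assuming the cartridge was registered in the standard DB.DBA.SYS_RDF_MAPPERS table):

    SELECT RM_ID, RM_PATTERN, RM_TYPE, RM_HOOK
      FROM DB.DBA.SYS_RDF_MAPPERS
     WHERE RM_HOOK = 'DB.DBA.RDF_LOAD_TXT_META';

A row with the '(text/plain)' pattern confirms the mapper is in place.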
  1. Paste the whole of this code into Conductor's iSQL interface and execute it to define and register the cartridge.

  2. Create a simple text document with a .txt extension, e.g. summary.txt.

  3. The .txt file must now be made Web accessible. A simple way to do this is to expose it as a WebDAV resource using Virtuoso's built-in WebDAV support:

    1. Log in to Virtuoso's ODS Briefcase application;

    2. Navigate to your Public folder;

    3. Upload your text document, ensuring that the file extension is .txt, the MIME type is set to text/plain and the file permissions are rw-r--r--.

    4. As a result, the file will be Web accessible via the URL http://cname/DAV/home/username/Public/summary.txt .


  4. To test the mapper, use the /sparql endpoint with the option 'Retrieve remote RDF data for all missing source graphs' enabled and execute, for example:

    SELECT *
    FROM <http://cname/DAV/home/username/Public/summary.txt>
    WHERE {?s ?p ?o}
  5. Click the "Run Query" button.

  6. As a result, the retrieved triples should be shown, for example:

    s                                                  p                         o
    http://cname/DAV/home/username/Public/summary.txt  urn:txt:v0.0:UniqueWords  47
    http://cname/DAV/home/username/Public/summary.txt  urn:txt:v0.0:Chars        625
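If the source document changes and you want the Sponger to re-fetch it rather than reuse the cached graph, the same query can be run from iSQL with Virtuoso's get:soft pragma set to "replace" (a sketch; the URL is the illustrative one from above):

    SPARQL
    define get:soft "replace"
    SELECT *
    FROM <http://cname/DAV/home/username/Public/summary.txt>
    WHERE {?s ?p ?o};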

Important: Setting Sponger Permissions

In order for the Sponger to update the local RDF quad store with triples derived from the fetched network resource's structured data, the role "SPARQL_SPONGE" must be granted to the account "SPARQL", i.e., the owner account of the /sparql web service endpoint. This should normally already be the case. If not, you must grant the role manually. As with most Virtuoso DBA tasks, the Conductor provides the simplest means of doing this.
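Alternatively, the grant can be issued from iSQL with a single statement:

    grant SPARQL_SPONGE to "SPARQL";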

[Tip] See Also:
  • The DB.DBA.RDF_LOAD_RDFXML function, which parses RDF/XML content.

  • The DB.DBA.TTLP_MT function, which parses Turtle or N3 content.

  • The gz_file_open function, which retrieves the content of a gzipped file, with examples of loading gzipped N3 and Turtle files.