Java Code Examples for org.apache.jena.riot.RDFDataMgr#parse()
The following examples show how to use org.apache.jena.riot.RDFDataMgr#parse().
You can go to the original project or source file by following the links above each example.
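Before the project examples, here is a minimal, self-contained sketch (not drawn from any of the projects below; the filename "data.ttl" is a placeholder) showing the basic pattern they all share: RDFDataMgr.parse() pushes parsed triples/quads into a StreamRDF destination, here one that simply writes them to standard output.

import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.system.StreamRDF;
import org.apache.jena.riot.system.StreamRDFLib;

public class MinimalParseExample {
    public static void main(String[] args) {
        // Destination stream that serializes each parsed triple/quad to stdout.
        StreamRDF dest = StreamRDFLib.writer(System.out);
        // Parse the file; the syntax is determined from the file extension.
        RDFDataMgr.parse(dest, "data.ttl");
    }
}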
Example 1
Source File: NtripleUtil.java From NLIWOD with GNU Affero General Public License v3.0
private static PipedRDFIterator<Triple> fileToStreamIterator(String filename) {
    PipedRDFIterator<Triple> iter = new PipedRDFIterator<>();
    final PipedRDFStream<Triple> inputStream = new PipedTriplesStream(iter);

    // PipedRDFStream and PipedRDFIterator need to be on different threads
    ExecutorService executor = Executors.newSingleThreadExecutor();

    // Create a runnable for our parser thread
    Runnable parser = new Runnable() {
        @Override
        public void run() {
            RDFDataMgr.parse(inputStream, filename);
        }
    };

    // Start the parser on another thread
    executor.submit(parser);

    // We will consume the input on the main thread here.
    // We can now iterate over data as it is parsed; parsing only runs as
    // far ahead of our consumption as the buffer size allows.
    return iter;
}
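The method above only hands back the iterator; a caller (hypothetical, not part of NtripleUtil) would then drain it on its own thread, mirroring Example 9 below. The filename is a placeholder:

PipedRDFIterator<Triple> iter = fileToStreamIterator("data.nt");
while (iter.hasNext()) {
    Triple triple = iter.next();
    // Do something with each triple as the parser thread produces it
}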
Example 2
Source File: Importer.java From fcrepo-import-export with Apache License 2.0
private Model parseStream(final InputStream in) throws IOException {
    final SubjectMappingStreamRDF mapper = new SubjectMappingStreamRDF(config.getSource(),
            config.getDestination());
    try (final InputStream in2 = in) {
        RDFDataMgr.parse(mapper, in2, contentTypeToLang(config.getRdfLanguage()));
    }
    return mapper.getModel();
}
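The contentTypeToLang(...) call (a statically imported helper in this project) maps the configured RDF language's content type to a Jena Lang for the InputStream overload of parse(). Jena ships an equivalent helper, RDFLanguages.contentTypeToLang(String); a small sketch, with "text/turtle" as an assumed input:

import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFLanguages;

public class ContentTypeLookup {
    public static void main(String[] args) {
        // Map a MIME content type to the corresponding Jena Lang constant.
        Lang lang = RDFLanguages.contentTypeToLang("text/turtle");
        System.out.println(lang);   // the Turtle language descriptor
    }
}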
Example 3
Source File: AbstractTestDeltaLink.java From rdf-delta with Apache License 2.0
public void datasource_init_01() {
    DeltaLink dLink = getLink();
    Id dsRef = dLink.newDataSource("datasource_15", "http://example/uri");
    assertEquals(1, dLink.listDatasets().size());
    DataSourceDescription dsd = dLink.getDataSourceDescriptionByURI("http://example/uri-not-present");
    String url = dLink.initialState(dsRef);
    assertNotNull(url);
    RDFDataMgr.parse(StreamRDFLib.sinkNull(), url);
}
Example 4
Source File: append.java From rdf-delta with Apache License 2.0
protected RDFPatch toPatch(String fn) {
    // .gz??
    Lang lang = RDFLanguages.filenameToLang(fn);
    if ( lang != null && ( RDFLanguages.isTriples(lang) || RDFLanguages.isQuads(lang) ) ) {
        RDFChangesCollector x = new RDFChangesCollector();
        StreamRDF dest = new RDF2Patch(x);
        // dest will do the start-finish on the RDFChangesCollector via parsing.
        RDFDataMgr.parse(dest, fn);
        return x.getRDFPatch();
    }

    // Not RDF - assume a text patch.
    // String ext = FileOps.extension(fn);
    // switch(ext) {
    //     case RDFPatchConst.EXT:
    //         break;
    //     case RDFPatchConst.EXT_B:
    //         break;
    //     default:
    //         Log.warn(addpatch.class, "Conventionally, patches have file extension ."+RDFPatchConst.EXT);
    // }
    Path path = Paths.get(fn);
    try (InputStream in = Files.newInputStream(path)) {
        return RDFPatchOps.read(in);
    } catch (IOException ex) {
        throw IOX.exception(ex);
    }
}
Example 5
Source File: RDFPatchOps.java From rdf-delta with Apache License 2.0
/** RDF data file to patch.
 *  The patch has no Id or Previous - see {@link #withHeader}.
 */
public static RDFPatch rdf2patch(String rdfDataFile) {
    RDFChangesCollector x = new RDFChangesCollector();
    RDF2Patch dest = new RDF2Patch(x);
    RDFDataMgr.parse(dest, rdfDataFile);
    // Headers.
    RDFPatch patch = x.getRDFPatch();
    return patch;
}
Example 6
Source File: InitialIndexTool.java From gerbil with GNU Affero General Public License v3.0
private static void indexStreamMem(Indexer index, InputStream in, SameAsCollectorStreamMem sink) {
    RDFDataMgr.parse(sink, in, Lang.TURTLE);
    LOGGER.info("Found {} instances of owl:sameAs", sink.getMapping().size());
    for (String key : sink.getMapping().keySet()) {
        index.indexSameAs(key, sink.getMapping().get(key));
    }
}
Example 7
Source File: ExRIOT_7.java From xcurator with Apache License 2.0
public static void main(String... argv) {
    final String filename = "data.ttl";
    CollectorStreamTriples inputStream = new CollectorStreamTriples();
    RDFDataMgr.parse(inputStream, filename);

    for (Triple triple : inputStream.getCollected()) {
        System.out.println(triple);
    }
}
Example 8
Source File: ExRIOT_4.java From xcurator with Apache License 2.0
public static void main(String... argv) {
    String filename = "data.ttl";

    // This is the heart of N-triples printing ... output is heavily buffered,
    // so the FilterSinkRDF calls flush at the end of parsing.
    Sink<Triple> output = new SinkTripleOutput(System.out, null, SyntaxLabels.createNodeToLabel());
    StreamRDF filtered = new FilterSinkRDF(output, FOAF.name, FOAF.knows);

    // Call the parsing process.
    RDFDataMgr.parse(filtered, filename);
}
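FilterSinkRDF is defined elsewhere in the same example set and is not shown here. A minimal sketch of such a filter, under the assumption that it simply forwards triples whose predicate matches one of the given properties to the Sink and flushes the Sink when parsing finishes (the class name PredicateFilterStream is hypothetical):

import java.util.HashSet;
import java.util.Set;

import org.apache.jena.atlas.lib.Sink;
import org.apache.jena.graph.Node;
import org.apache.jena.graph.Triple;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.riot.system.StreamRDFBase;

// Hypothetical stand-in for FilterSinkRDF: keep only triples with a matching predicate.
class PredicateFilterStream extends StreamRDFBase {
    private final Sink<Triple> output;
    private final Set<Node> predicates = new HashSet<>();

    PredicateFilterStream(Sink<Triple> output, Property... properties) {
        this.output = output;
        for (Property p : properties)
            predicates.add(p.asNode());
    }

    @Override
    public void triple(Triple triple) {
        if (predicates.contains(triple.getPredicate()))
            output.send(triple);
    }

    @Override
    public void finish() {
        // The SinkTripleOutput is buffered, so flush it once parsing completes.
        output.flush();
    }
}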
Example 9
Source File: ExRIOT_6.java From xcurator with Apache License 2.0
public static void main(String... argv) {
    final String filename = "data.ttl";

    // Create a PipedRDFStream to accept input and a PipedRDFIterator to consume it.
    // You can optionally supply a buffer size here for the PipedRDFIterator;
    // see the documentation for details about recommended buffer sizes.
    PipedRDFIterator<Triple> iter = new PipedRDFIterator<Triple>();
    final PipedRDFStream<Triple> inputStream = new PipedTriplesStream(iter);

    // PipedRDFStream and PipedRDFIterator need to be on different threads
    ExecutorService executor = Executors.newSingleThreadExecutor();

    // Create a runnable for our parser thread
    Runnable parser = new Runnable() {
        @Override
        public void run() {
            // Call the parsing process.
            RDFDataMgr.parse(inputStream, filename);
        }
    };

    // Start the parser on another thread
    executor.submit(parser);

    // We will consume the input on the main thread here.
    // We can now iterate over data as it is parsed; parsing only runs as
    // far ahead of our consumption as the buffer size allows.
    while (iter.hasNext()) {
        Triple next = iter.next();
        // Do something with each triple
    }
}
Example 10
Source File: InitialIndexTool.java From gerbil with GNU Affero General Public License v3.0
private static void indexStream(Indexer index, InputStream in, SameAsCollectorStreamFile sink)
        throws IOException, GerbilException {
    // Write everything to a tmp file using the SameAsCollectorStream
    RDFDataMgr.parse(sink, in, Lang.TURTLE);
}