org.openrdf.rio.RDFWriter Java Examples
The following examples show how to use
org.openrdf.rio.RDFWriter.
The source file, project, and license for each snippet are noted above the example.
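Nearly every snippet below follows the same RDFWriter lifecycle: obtain a writer for a target stream or Writer (via Rio.createWriter or an RDFWriterFactory), call startRDF(), optionally register namespace prefixes, feed each triple to handleStatement(), and finish with endRDF(). The standalone sketch below is not taken from any of the listed projects; it assumes a Sesame 2.x (org.openrdf) classpath, and the IRIs are made up for illustration.

import java.io.StringWriter;

import org.openrdf.model.Statement;
import org.openrdf.model.ValueFactory;
import org.openrdf.model.impl.ValueFactoryImpl;
import org.openrdf.rio.RDFFormat;
import org.openrdf.rio.RDFHandlerException;
import org.openrdf.rio.RDFWriter;
import org.openrdf.rio.Rio;

public class RdfWriterSketch {

    public static void main(String[] args) throws RDFHandlerException {
        // Build a single statement with made-up IRIs (illustration only).
        ValueFactory vf = new ValueFactoryImpl();
        Statement st = vf.createStatement(
                vf.createURI("http://example.org/book/1"),
                vf.createURI("http://purl.org/dc/elements/1.1/title"),
                vf.createLiteral("An example title"));

        // Create a Turtle writer over an in-memory Writer.
        StringWriter out = new StringWriter();
        RDFWriter writer = Rio.createWriter(RDFFormat.TURTLE, out);

        writer.startRDF();                 // begin the document
        writer.handleNamespace("dc", "http://purl.org/dc/elements/1.1/");
        writer.handleStatement(st);        // one call per triple
        writer.endRDF();                   // finish and flush

        System.out.println(out);
    }
}

The examples that follow differ mainly in where the statements come from (a repository export, a SPARQL CONSTRUCT/DESCRIBE query, or a plain iterator) and where the serialized output goes.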
Example #1
Source File: RDFModelFormater.java From ldp4j with Apache License 2.0 | 6 votes |
private RDFWriter createWriter(StringWriter writer) {
    RDFWriter result=null;
    if(format.equals(Format.TURTLE)) {
        result=new TurtlePrettyPrinter(new MemValueFactory().createURI(baseURI.toString()),writer);
    } else {
        RDFWriterRegistry registry=RDFWriterRegistry.getInstance();
        RDFFormat rawFormat=Rio.getWriterFormatForMIMEType(format.getMime(),RDFFormat.RDFXML);
        RDFWriterFactory factory=registry.get(rawFormat);
        result=factory.getWriter(writer);
        if(format.equals(Format.JSON_LD)) {
            result.getWriterConfig().set(JSONLDSettings.JSONLD_MODE,JSONLDMode.FLATTEN);
            result.getWriterConfig().set(BasicWriterSettings.PRETTY_PRINT,true);
        }
    }
    return result;
}
Example #2
Source File: SPARQLGraphStreamingOutput.java From neo4j-sparql-extension with GNU General Public License v3.0 | 6 votes |
/**
 * Called by JAX-RS upon building a response.
 *
 * @param out the {@link OutputStream} to write the triples to
 * @throws IOException if there was an error during communication
 * @throws WebApplicationException if there was an error while serialising
 */
@Override
public void write(OutputStream out)
        throws IOException, WebApplicationException {
    try {
        RDFWriter writer = factory.getWriter(out);
        // evaluate query and stream result
        query.evaluate(writer);
        conn.close();
    } catch (RepositoryException | QueryEvaluationException | RDFHandlerException ex) {
        // server error
        close(conn, ex);
        throw new WebApplicationException(ex);
    }
}
Example #3
Source File: RDFStreamingOutput.java From neo4j-sparql-extension with GNU General Public License v3.0 | 6 votes |
/**
 * Called by JAX-RS upon building a response.
 *
 * @param out the {@link OutputStream} to write the triples to
 * @throws IOException if there was an error during communication
 * @throws WebApplicationException if there was an error while serialising
 */
@Override
public void write(OutputStream out)
        throws IOException, WebApplicationException {
    try {
        // get an RDF writer to stream the triples
        RDFWriter writer = factory.getWriter(out);
        // export triples from graphs
        conn.export(writer, contexts);
        conn.close();
    } catch (RepositoryException | RDFHandlerException ex) {
        // server error
        close(conn, ex);
        throw new WebApplicationException(ex);
    }
}
Example #4
Source File: CustomSesameDataset.java From GeoTriples with Apache License 2.0 | 6 votes |
/**
 * Execute a CONSTRUCT/DESCRIBE SPARQL query against the graphs
 *
 * @param qs
 *            CONSTRUCT or DESCRIBE SPARQL query
 * @param format
 *            the serialization format for the returned graph
 * @return serialized graph of results
 */
public String runSPARQL(String qs, RDFFormat format) {
    try {
        RepositoryConnection con = currentRepository.getConnection();
        try {
            GraphQuery query = con.prepareGraphQuery(
                    org.openrdf.query.QueryLanguage.SPARQL, qs);
            StringWriter stringout = new StringWriter();
            RDFWriter w = Rio.createWriter(format, stringout);
            query.evaluate(w);
            return stringout.toString();
        } finally {
            con.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
Example #5
Source File: CustomSesameDataset.java From GeoTriples with Apache License 2.0 | 6 votes |
public String printRDF(RDFFormat outform) {
    try {
        RepositoryConnection con = currentRepository.getConnection();
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            RDFWriter w = Rio.createWriter(outform, out);
            con.export(w);
            String result = new String(out.toByteArray(), "UTF-8");
            return result;
        } finally {
            con.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
Example #6
Source File: CustomSesameDataset.java From GeoTriples with Apache License 2.0 | 5 votes |
/**
 * Dump RDF graph
 *
 * @param out
 *            output stream for the serialization
 * @param outform
 *            the RDF serialization format for the dump
 */
public void dumpRDF(OutputStream out, RDFFormat outform) {
    try {
        RepositoryConnection con = currentRepository.getConnection();
        try {
            RDFWriter w = Rio.createWriter(outform, out);
            con.export(w);
        } finally {
            con.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Example #7
Source File: RDFModelFormater.java From ldp4j with Apache License 2.0 | 5 votes |
protected String exportRepository(RepositoryConnection connection) throws RepositoryException, RDFHandlerException {
    StringWriter writer=new StringWriter();
    RDFWriter rdfWriter=Rio.createWriter(getFormat(),writer);
    if(rdfWriter instanceof TurtleWriter) {
        rdfWriter=new RebasingTurtleWriter(writer);
    }
    connection.export(rdfWriter);
    return writer.toString();
}
Example #8
Source File: RepositoryModelSet.java From semweb4j with BSD 2-Clause "Simplified" License | 5 votes |
private void writeTo(RDFWriter writer) throws ModelRuntimeException {
    this.assertModel();
    try {
        this.connection.export(writer);
    } catch(OpenRDFException e) {
        throw new ModelRuntimeException(e);
    }
}
Example #9
Source File: RepositoryModel.java From semweb4j with BSD 2-Clause "Simplified" License | 5 votes |
@Override
public void writeTo(OutputStream stream, Syntax syntax) throws
        // interface allows it
        IOException, ModelRuntimeException {
    RDFWriter rdfWriter = Rio.createWriter(getRDFFormat(syntax), stream);
    writeTo(rdfWriter);
}
Example #10
Source File: TestUtils.java From cumulusrdf with Apache License 2.0 | 5 votes |
/**
 * Converts a statement iterator to an input stream with serialized RDF data.
 *
 * @param statements The statement iterator.
 * @param format The RDF format to use for serialization.
 * @return The serialized RDF data.
 * @throws RDFHandlerException in case of operation failure.
 */
public static InputStream statementIteratorToRdfStream(final Iterator<Statement> statements, final RDFFormat format)
        throws RDFHandlerException {
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    RDFWriter rdfWriter = Rio.createWriter(format, stream);
    rdfWriter.startRDF();
    while (statements.hasNext()) {
        rdfWriter.handleStatement(statements.next());
    }
    rdfWriter.endRDF();
    return new ByteArrayInputStream(stream.toByteArray());
}
Example #11
Source File: TripleStoreBlazegraph.java From powsybl-core with Mozilla Public License 2.0 | 5 votes |
private static void write(Model statements, OutputStream out) {
    try (PrintStream pout = new PrintStream(out)) {
        RDFWriter w = new RDFXMLPrettyWriter(pout);
        Rio.write(statements, w);
    } catch (Exception x) {
        throw new TripleStoreException("Writing model statements", x);
    }
}
Example #12
Source File: ApplicationXmlTest.java From attic-polygene-java with Apache License 2.0 | 5 votes |
private void writeN3( Iterable<Statement> graph )
    throws RDFHandlerException, IOException
{
    RDFWriterFactory writerFactory = new N3WriterFactory();
    RDFWriter writer = writerFactory.getWriter( System.out );
    writeOutput( writer, graph );
}
Example #13
Source File: ApplicationXmlTest.java From attic-polygene-java with Apache License 2.0 | 5 votes |
private void writeXml( Iterable<Statement> graph )
    throws RDFHandlerException, IOException
{
    RDFWriterFactory writerFactory = new RDFXMLWriterFactory();
    RDFWriter writer = writerFactory.getWriter( System.out );
    writeOutput( writer, graph );
}
Example #14
Source File: ApplicationXmlTest.java From attic-polygene-java with Apache License 2.0 | 5 votes |
private void writeOutput( RDFWriter writer, Iterable<Statement> graph )
    throws RDFHandlerException
{
    writer.startRDF();
    writer.handleNamespace( "polygene", PolygeneRdf.POLYGENE_MODEL );
    writer.handleNamespace( "rdf", Rdfs.RDF );
    writer.handleNamespace( "rdfs", Rdfs.RDFS );
    for( Statement st : graph )
    {
        writer.handleStatement( st );
    }
    writer.endRDF();
}
Example #15
Source File: RdfIndexExporter.java From attic-polygene-java with Apache License 2.0 | 5 votes |
@Override
public void exportReadableToStream( PrintStream out )
    throws IOException
{
    RDFWriter rdfWriter = Rio.createWriter( RDFFormat.TRIG, out );
    exportToWriter( rdfWriter );
}
Example #16
Source File: RdfIndexExporter.java From attic-polygene-java with Apache License 2.0 | 5 votes |
@Override
public void exportFormalToWriter( PrintWriter out )
    throws IOException
{
    RDFWriter rdfWriter = Rio.createWriter( RDFFormat.RDFXML, out );
    exportToWriter( rdfWriter );
}
Example #17
Source File: LinkedDataServletTest.java From cumulusrdf with Apache License 2.0 | 4 votes |
@Override
public RDFWriter getWriter(Writer writer) {
    return new HTMLWriter(writer);
}
Example #18
Source File: SubSetCreator.java From mustard with MIT License | 4 votes |
/**
 * @param args
 */
public static void main(String[] args) {
    boolean[] inference = {false, true};
    long[] seeds = {1,2,3,4,5,6,7,8,9,10};
    double fraction = 0.05;
    int minSize = 0;
    int maxClasses = 3;
    int depth = 3;

    ///*
    String saveDir = "datasets/BGS_small/BGSsubset";
    String loadDir = BGS_FOLDER;
    RDFFormat format = RDFFormat.NTRIPLES;
    //*/

    /*
    String saveDir = "datasets/AMsubset";
    String loadDir = AM_FOLDER;
    RDFFormat format = RDFFormat.TURTLE;
    //*/

    RDFDataSet tripleStore = new RDFFileDataSet(loadDir, format);

    LargeClassificationDataSet ds = new BGSDataSet(tripleStore, "http://data.bgs.ac.uk/ref/Lexicon/hasTheme", 10, 0.05, 5, 3);
    //LargeClassificationDataSet ds = new AMDataSet(tripleStore, 10, 0.05, 5, 3, true);

    for (long seed : seeds) {
        for (boolean inf : inference) {
            ds.createSubSet(seed, fraction, minSize, maxClasses);

            System.out.println("Getting Statements...");
            Set<Statement> stmts = RDFUtils.getStatements4Depth(tripleStore, ds.getRDFData().getInstances(), depth, inf);
            System.out.println("# Statements: " + stmts.size());
            stmts.removeAll(new HashSet<Statement>(ds.getRDFData().getBlackList()));
            System.out.println("# Statements: " + stmts.size() + ", after blackList");

            File dir = new File(saveDir + seed + inf);
            dir.mkdirs();
            File file = new File(saveDir + seed + inf, "subset.ttl");
            File instFile = new File(saveDir + seed + inf, "instances.txt");
            File targetFile = new File(saveDir + seed + inf, "target.txt");

            try {
                RDFWriter writer = Rio.createWriter(RDFFormat.TURTLE, new FileWriter(file));
                FileWriter instWriter = new FileWriter(instFile);
                FileWriter targetWriter = new FileWriter(targetFile);

                writer.startRDF();
                for (Statement stmt : stmts) {
                    writer.handleStatement(stmt);
                }
                writer.endRDF();

                for (Resource inst : ds.getRDFData().getInstances()) {
                    instWriter.write(inst.toString() + "\n");
                }
                instWriter.close();

                for (Double target : ds.getTarget()) {
                    targetWriter.write(target.toString() + "\n");
                }
                targetWriter.close();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
}
Example #19
Source File: RepositoryModelSet.java From semweb4j with BSD 2-Clause "Simplified" License | 4 votes |
/**
 * Writes the whole ModelSet to the Writer. Depending on the Syntax the
 * context URIs might or might not be serialized. TriX should be able to
 * serialize contexts.
 */
@Override
public void writeTo(Writer writer, Syntax syntax) throws IOException, ModelRuntimeException {
    RDFWriter rdfWriter = Rio.createWriter(getRDFFormat(syntax), writer);
    this.writeTo(rdfWriter);
}
Example #20
Source File: RepositoryModelSet.java From semweb4j with BSD 2-Clause "Simplified" License | 4 votes |
/**
 * Writes the whole ModelSet to the OutputStream. Depending on the Syntax
 * the context URIs might or might not be serialized. TriX should be able to
 * serialize contexts.
 */
@Override
public void writeTo(OutputStream out, Syntax syntax) throws IOException, ModelRuntimeException {
    RDFWriter rdfWriter = Rio.createWriter(getRDFFormat(syntax), out);
    this.writeTo(rdfWriter);
}
Example #21
Source File: RepositoryModel.java From semweb4j with BSD 2-Clause "Simplified" License | 4 votes |
@Override
public void writeTo(Writer writer, Syntax syntax) throws ModelRuntimeException {
    assertModel();
    RDFWriter rdfWriter = Rio.createWriter(getRDFFormat(syntax), writer);
    writeTo(rdfWriter);
}
Example #22
Source File: RemoteRepositoryBase.java From database with GNU General Public License v2.0 | 4 votes |
/**
 * Serialize an iteration of statements into a byte[] to send across the
 * wire.
 */
protected static byte[] serialize(final Iterable<? extends Statement> stmts,
        final RDFFormat format) throws Exception {

    final RDFWriterFactory writerFactory = RDFWriterRegistry.getInstance().get(format);

    final ByteArrayOutputStream baos = new ByteArrayOutputStream();

    final RDFWriter writer = writerFactory.getWriter(baos);

    writer.startRDF();
    for (Statement stmt : stmts) {
        writer.handleStatement(stmt);
    }
    writer.endRDF();

    final byte[] data = baos.toByteArray();

    return data;
}
Example #23
Source File: QueryServlet.java From database with GNU General Public License v2.0 | 4 votes |
@Override
public Void call() throws Exception {
    BigdataSailRepositoryConnection conn = null;
    try {
        conn = getQueryConnection();

        String mimeType = null;
        RDFFormat format = null;
        if (mimeTypes != null) {
            mimeTypesLoop:
            while (mimeTypes.hasMoreElements()) {
                for (String mt : mimeTypes.nextElement().split(",")) {
                    mt = mt.trim();
                    RDFFormat fmt = RDFWriterRegistry.getInstance()
                            .getFileFormatForMIMEType(mt);
                    if (conn.getTripleStore().isQuads()
                            && (mt.equals(RDFFormat.NQUADS.getDefaultMIMEType())
                                || mt.equals(RDFFormat.TURTLE.getDefaultMIMEType()))
                            || !conn.getTripleStore().isQuads() && fmt != null) {
                        mimeType = mt;
                        format = fmt;
                        break mimeTypesLoop;
                    }
                }
            }
        }
        if (format == null) {
            if (conn.getTripleStore().isQuads()) {
                mimeType = RDFFormat.NQUADS.getDefaultMIMEType();
            } else {
                mimeType = RDFFormat.NTRIPLES.getDefaultMIMEType();
            }
            format = RDFWriterRegistry.getInstance()
                    .getFileFormatForMIMEType(mimeType);
        }

        resp.setContentType(mimeType);
        final OutputStream os = resp.getOutputStream();
        final RDFWriter w = RDFWriterRegistry.getInstance().get(format)
                .getWriter(os);
        RepositoryResult<Statement> stmts = null;
        try {
            w.startRDF();
            stmts = conn.getStatements(s, p, o, includeInferred, c);
            while (stmts.hasNext()) {
                w.handleStatement(stmts.next());
            }
            w.endRDF();
        } finally {
            if (stmts != null) {
                stmts.close();
            }
            os.flush();
            os.close();
        }
        return null;
    } finally {
        if (conn != null) {
            conn.close();
        }
    }
}
Example #24
Source File: LinkedDataServletTest.java From cumulusrdf with Apache License 2.0 | 4 votes |
@Override
public RDFWriter getWriter(OutputStream out) {
    return new HTMLWriter(out);
}
Example #25
Source File: Listener.java From cumulusrdf with Apache License 2.0 | 4 votes |
@Override
public RDFWriter getWriter(Writer writer) {
    return new HTMLWriter(writer);
}
Example #26
Source File: Listener.java From cumulusrdf with Apache License 2.0 | 4 votes |
@Override
public RDFWriter getWriter(OutputStream out) {
    return new HTMLWriter(out);
}
Example #27
Source File: ExportKB.java From database with GNU General Public License v2.0 | 4 votes |
/**
 * Exports all told statements associated with the last commit point for the
 * KB.
 *
 * @throws IOException
 * @throws SailException
 * @throws RDFHandlerException
 */
public void exportData() throws IOException, SailException, RDFHandlerException {
    prepare();
//  final BigdataSail sail = new BigdataSail(kb);
//  try {
//      sail.initialize();
//      final SailConnection conn = sail.getReadOnlyConnection();
//      try {
    final CloseableIteration<? extends Statement, SailException> itr = conn
            .getStatements(null/* s */, null/* p */, null/* o */,
                    includeInferred, new Resource[] {}/* contexts */);
    try {
        final File file = new File(kbdir, "data."
                + format.getDefaultFileExtension() + ".gz");
        System.out.println("Writing " + file);
        final OutputStream os = new GZIPOutputStream(
                new FileOutputStream(file));
        try {
            final RDFWriter writer = RDFWriterRegistry
                    .getInstance().get(format).getWriter(os);
            writer.startRDF();
            while (itr.hasNext()) {
                final Statement stmt = itr.next();
                writer.handleStatement(stmt);
            }
            writer.endRDF();
        } finally {
            os.close();
        }
    } finally {
        itr.close();
    }
//      } finally {
//          conn.close();
//      }
//  } finally {
//      sail.shutDown();
//  }
}
Example #28
Source File: BigdataRDFContext.java From database with GNU General Public License v2.0 | 4 votes |
@Override
protected void doQuery(final BigdataSailRepositoryConnection cxn,
        final OutputStream os) throws Exception {

    final BigdataSailGraphQuery query = (BigdataSailGraphQuery) setupQuery(cxn);

    // Note: getQueryTask() verifies that format will be non-null.
    final RDFFormat format = RDFWriterRegistry.getInstance()
            .getFileFormatForMIMEType(mimeType);

    final RDFWriter w = RDFWriterRegistry.getInstance().get(format)
            .getWriter(os);

    query.evaluate(w);
}
Example #29
Source File: BigdataSPARQLResultsJSONWriterForConstructFactory.java From database with GNU General Public License v2.0 | 4 votes |
@Override
public RDFWriter getWriter(final OutputStream out) {
    return new BigdataSPARQLResultsJSONWriterForConstruct(out);
}
Example #30
Source File: BigdataSPARQLResultsJSONWriterForConstructFactory.java From database with GNU General Public License v2.0 | 4 votes |
@Override
public RDFWriter getWriter(final Writer writer) {
    return new BigdataSPARQLResultsJSONWriterForConstruct(writer);
}