Java Code Examples for org.apache.lucene.store.IndexOutput#writeInt()
The following examples show how to use org.apache.lucene.store.IndexOutput#writeInt().
Each example notes its original project, source file, and license.
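Before the project examples, here is a minimal, self-contained round-trip sketch: an int is written with IndexOutput#writeInt() and read back with IndexInput#readInt(). It assumes a recent Lucene release that provides the in-memory ByteBuffersDirectory (several of the older examples below use RAMDirectory instead); the class name WriteIntDemo and the file name "demo.bin" are illustrative only.

import org.apache.lucene.store.ByteBuffersDirectory;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexInput;
import org.apache.lucene.store.IndexOutput;

public class WriteIntDemo {
  public static void main(String[] args) throws Exception {
    try (Directory dir = new ByteBuffersDirectory()) {
      // Write a single 32-bit int; the resulting file is exactly 4 bytes long.
      try (IndexOutput out = dir.createOutput("demo.bin", IOContext.DEFAULT)) {
        out.writeInt(42);
      }
      // Read the value back through the matching IndexInput.
      try (IndexInput in = dir.openInput("demo.bin", IOContext.DEFAULT)) {
        System.out.println(in.readInt()); // prints 42
      }
    }
  }
}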
Example 1
Source File: TestMixedDirectory.java From hadoop-gpu with Apache License 2.0
public void testMixedDirectoryAndPolicy() throws IOException {
  Directory readDir = new RAMDirectory();
  updateIndex(readDir, 0, numDocsPerUpdate,
      new KeepOnlyLastCommitDeletionPolicy());
  verify(readDir, numDocsPerUpdate);

  IndexOutput out = readDir.createOutput("_"
      + (numDocsPerUpdate / maxBufferedDocs + 2) + ".cfs");
  out.writeInt(0);
  out.close();

  Directory writeDir = new RAMDirectory();
  Directory mixedDir = new MixedDirectory(readDir, writeDir);
  updateIndex(mixedDir, numDocsPerUpdate, numDocsPerUpdate,
      new MixedDeletionPolicy());
  verify(readDir, numDocsPerUpdate);
  verify(mixedDir, 2 * numDocsPerUpdate);
}
Example 2
Source File: TestMixedDirectory.java From RDFS with Apache License 2.0
public void testMixedDirectoryAndPolicy() throws IOException {
  Directory readDir = new RAMDirectory();
  updateIndex(readDir, 0, numDocsPerUpdate,
      new KeepOnlyLastCommitDeletionPolicy());
  verify(readDir, numDocsPerUpdate);

  IndexOutput out = readDir.createOutput("_"
      + (numDocsPerUpdate / maxBufferedDocs + 2) + ".cfs");
  out.writeInt(0);
  out.close();

  Directory writeDir = new RAMDirectory();
  Directory mixedDir = new MixedDirectory(readDir, writeDir);
  updateIndex(mixedDir, numDocsPerUpdate, numDocsPerUpdate,
      new MixedDeletionPolicy());
  verify(readDir, numDocsPerUpdate);
  verify(mixedDir, 2 * numDocsPerUpdate);
}
Example 3
Source File: Text2Bin.java From lesk-wsd-dsm with GNU General Public License v3.0
/**
 * Convert a WordSpace text matrix to a bin WordSpace file
 * Text matrix format:
 * - the first line contains the matrix dimensions N
 * - each line contains the word vector information: word d1 d2 ... dN
 * Text2Bin text_matrix_file bin_matrix_file
 * @param args the command line arguments
 */
public static void main(String[] args) {
  try {
    BufferedReader in = new BufferedReader(new FileReader(args[0]));
    File file = new File(args[1]);
    FSDirectory fs = FSDirectory.open(file.getParentFile());
    IndexOutput output = fs.createOutput(file.getName());
    String header = in.readLine();
    output.writeString("-dimensions");
    output.writeInt(Integer.parseInt(header));
    while (in.ready()) {
      String line = in.readLine();
      String[] split = line.split("\t");
      output.writeString(split[0]);
      for (int i = 1; i < split.length; i++) {
        output.writeInt(Float.floatToIntBits(Float.parseFloat(split[i])));
      }
    }
    in.close();
    output.close();
  } catch (IOException ex) {
    Logger.getLogger(Text2Bin.class.getName()).log(Level.SEVERE, null, ex);
  }
}
Example 4
Source File: RealVector.java From semanticvectors with BSD 3-Clause "New" or "Revised" License
/**
 * Writes vector out in dense format. If vector is originally sparse, writes out a copy so
 * that vector remains sparse. Truncates to length k.
 */
public void writeToLuceneStream(IndexOutput outputStream, int k) {
  float[] coordsToWrite;
  if (isSparse) {
    RealVector copy = copy();
    copy.sparseToDense();
    coordsToWrite = copy.coordinates;
  } else {
    coordsToWrite = coordinates;
  }
  for (int i = 0; i < k; ++i) {
    try {
      outputStream.writeInt(Float.floatToIntBits(coordsToWrite[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
}
Example 5
Source File: RealVector.java From semanticvectors with BSD 3-Clause "New" or "Revised" License
@Override
/**
 * Writes vector out in dense format. If vector is originally sparse, writes out a copy so
 * that vector remains sparse.
 */
public void writeToLuceneStream(IndexOutput outputStream) {
  float[] coordsToWrite;
  if (isSparse) {
    RealVector copy = copy();
    copy.sparseToDense();
    coordsToWrite = copy.coordinates;
  } else {
    coordsToWrite = coordinates;
  }
  for (int i = 0; i < dimension; ++i) {
    try {
      outputStream.writeInt(Float.floatToIntBits(coordsToWrite[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
}
Example 6
Source File: PermutationVector.java From semanticvectors with BSD 3-Clause "New" or "Revised" License
@Override
/**
 * Writes vector out in dense format.
 */
public void writeToLuceneStream(IndexOutput outputStream) {
  int[] coordsToWrite;
  coordsToWrite = coordinates;
  for (int i = 0; i < dimension; ++i) {
    try {
      outputStream.writeInt((coordsToWrite[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
}
Example 7
Source File: ComplexVector.java From semanticvectors with BSD 3-Clause "New" or "Revised" License
/**
 * Transforms vector to cartesian form and writes vector out in dense format, truncating the
 * vectors to the assigned dimensionality
 */
public void writeToLuceneStream(IndexOutput outputStream, int k) {
  toCartesian();
  for (int i = 0; i < k * 2; ++i) {
    try {
      outputStream.writeInt(Float.floatToIntBits(coordinates[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
  /* DORMANT CODE!
  assert(opMode != MODE.POLAR_SPARSE);
  if (opMode == MODE.CARTESIAN) {
    cartesianToDensePolar();
  }
  for (int i = 0; i < dimension; ++i) {
    try {
      outputStream.writeInt((int)(phaseAngles[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
  */
}
Example 8
Source File: HdfsDirectoryTest.java From lucene-solr with Apache License 2.0
public void testRename() throws IOException {
  String[] listAll = directory.listAll();
  for (String file : listAll) {
    directory.deleteFile(file);
  }
  IndexOutput output = directory.createOutput("testing.test", new IOContext());
  output.writeInt(12345);
  output.close();
  directory.rename("testing.test", "testing.test.renamed");
  assertFalse(slowFileExists(directory, "testing.test"));
  assertTrue(slowFileExists(directory, "testing.test.renamed"));
  IndexInput input = directory.openInput("testing.test.renamed", new IOContext());
  assertEquals(12345, input.readInt());
  assertEquals(input.getFilePointer(), input.length());
  input.close();
  directory.deleteFile("testing.test.renamed");
  assertFalse(slowFileExists(directory, "testing.test.renamed"));
}
Example 9
Source File: TestCodecUtil.java From lucene-solr with Apache License 2.0
public void testCheckFooterInvalid() throws Exception {
  ByteBuffersDataOutput out = new ByteBuffersDataOutput();
  IndexOutput output = new ByteBuffersIndexOutput(out, "temp", "temp");
  CodecUtil.writeHeader(output, "FooBar", 5);
  output.writeString("this is the data");
  output.writeInt(CodecUtil.FOOTER_MAGIC);
  output.writeInt(0);
  output.writeLong(1234567); // write a bogus checksum
  output.close();

  ChecksumIndexInput input = new BufferedChecksumIndexInput(
      new ByteBuffersIndexInput(out.toDataInput(), "temp"));
  CodecUtil.checkHeader(input, "FooBar", 5, 5);
  assertEquals("this is the data", input.readString());

  Exception mine = new RuntimeException("fake exception");
  CorruptIndexException expected = expectThrows(CorruptIndexException.class, () -> {
    CodecUtil.checkFooter(input, mine);
  });
  assertTrue(expected.getMessage().contains("checksum failed"));
  Throwable suppressed[] = expected.getSuppressed();
  assertEquals(1, suppressed.length);
  assertEquals("fake exception", suppressed[0].getMessage());
  input.close();
}
Example 10
Source File: MultiInstancesHdfsDirectoryTest.java From incubator-retired-blur with Apache License 2.0
@Test
public void testMultiInstancesHdfsDirectoryTest1() throws IOException, InterruptedException {
  HdfsDirectory dir1 = new HdfsDirectory(_configuration, new Path(_root, "dir"));
  IndexOutput output = dir1.createOutput("a", IOContext.DEFAULT);
  output.writeInt(1234);
  output.close();

  HdfsDirectory dir2 = new HdfsDirectory(_configuration, new Path(_root, "dir"));
  IndexInput input = dir2.openInput("a", IOContext.READ);
  assertEquals(4, input.length());
  assertEquals(1234, input.readInt());
  input.close();

  dir1.close();
  dir2.close();
}
Example 11
Source File: BaseCompoundFormatTestCase.java From lucene-solr with Apache License 2.0
public void testMakeLockDisabled() throws IOException {
  final String testfile = "_123.test";
  Directory dir = newDirectory();
  IndexOutput out = dir.createOutput(testfile, IOContext.DEFAULT);
  out.writeInt(3);
  out.close();

  SegmentInfo si = newSegmentInfo(dir, "_123");
  si.setFiles(Collections.emptyList());
  si.getCodec().compoundFormat().write(dir, si, IOContext.DEFAULT);
  Directory cfs = si.getCodec().compoundFormat().getCompoundReader(dir, si, IOContext.DEFAULT);
  expectThrows(UnsupportedOperationException.class, () -> {
    cfs.obtainLock("foobar");
  });
  cfs.close();
  dir.close();
}
Example 12
Source File: BaseCompoundFormatTestCase.java From lucene-solr with Apache License 2.0
public void testSyncDisabled() throws IOException {
  final String testfile = "_123.test";
  Directory dir = newDirectory();
  IndexOutput out = dir.createOutput(testfile, IOContext.DEFAULT);
  out.writeInt(3);
  out.close();

  SegmentInfo si = newSegmentInfo(dir, "_123");
  si.setFiles(Collections.emptyList());
  si.getCodec().compoundFormat().write(dir, si, IOContext.DEFAULT);
  Directory cfs = si.getCodec().compoundFormat().getCompoundReader(dir, si, IOContext.DEFAULT);
  expectThrows(UnsupportedOperationException.class, () -> {
    cfs.sync(Collections.singleton(testfile));
  });
  cfs.close();
  dir.close();
}
Example 13
Source File: BaseCompoundFormatTestCase.java From lucene-solr with Apache License 2.0
public void testDeleteFileDisabled() throws IOException {
  final String testfile = "_123.test";
  Directory dir = newDirectory();
  IndexOutput out = dir.createOutput(testfile, IOContext.DEFAULT);
  out.writeInt(3);
  out.close();

  SegmentInfo si = newSegmentInfo(dir, "_123");
  si.setFiles(Collections.emptyList());
  si.getCodec().compoundFormat().write(dir, si, IOContext.DEFAULT);
  Directory cfs = si.getCodec().compoundFormat().getCompoundReader(dir, si, IOContext.DEFAULT);
  expectThrows(UnsupportedOperationException.class, () -> {
    cfs.deleteFile(testfile);
  });
  cfs.close();
  dir.close();
}
Example 14
Source File: HdfsDirectoryTest.java From lucene-solr with Apache License 2.0
@Test
public void testWritingAndReadingAFile() throws IOException {
  String[] listAll = directory.listAll();
  for (String file : listAll) {
    directory.deleteFile(file);
  }

  IndexOutput output = directory.createOutput("testing.test", new IOContext());
  output.writeInt(12345);
  output.close();

  IndexInput input = directory.openInput("testing.test", new IOContext());
  assertEquals(12345, input.readInt());
  input.close();

  listAll = directory.listAll();
  assertEquals(1, listAll.length);
  assertEquals("testing.test", listAll[0]);
  assertEquals(4, directory.fileLength("testing.test"));

  IndexInput input1 = directory.openInput("testing.test", new IOContext());
  IndexInput input2 = input1.clone();
  assertEquals(12345, input2.readInt());
  input2.close();

  assertEquals(12345, input1.readInt());
  input1.close();

  assertFalse(slowFileExists(directory, "testing.test.other"));
  assertTrue(slowFileExists(directory, "testing.test"));
  directory.deleteFile("testing.test");
  assertFalse(slowFileExists(directory, "testing.test"));
}
Example 15
Source File: TestCodecUtil.java From lucene-solr with Apache License 2.0
public void testReadHeaderWrongMagic() throws Exception {
  ByteBuffersDataOutput out = new ByteBuffersDataOutput();
  IndexOutput output = new ByteBuffersIndexOutput(out, "temp", "temp");
  output.writeInt(1234);
  output.close();

  IndexInput input = new ByteBuffersIndexInput(out.toDataInput(), "temp");
  expectThrows(CorruptIndexException.class, () -> {
    CodecUtil.checkHeader(input, "bogus", 1, 1);
  });
}
Example 16
Source File: ComplexVector.java From semanticvectors with BSD 3-Clause "New" or "Revised" License
@Override
/**
 * Transforms vector to cartesian form and writes vector out in dense format.
 */
public void writeToLuceneStream(IndexOutput outputStream) {
  toCartesian();
  for (int i = 0; i < dimension * 2; ++i) {
    try {
      outputStream.writeInt(Float.floatToIntBits(coordinates[i]));
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
}
Example 17
Source File: IndexedDISI.java From lucene-solr with Apache License 2.0
private static short flushBlockJumps(int[] jumps, int blockCount, IndexOutput out, long origo) throws IOException {
  if (blockCount == 2) { // Jumps with a single real entry + NO_MORE_DOCS is just wasted space so we ignore that
    blockCount = 0;
  }
  for (int i = 0 ; i < blockCount ; i++) {
    out.writeInt(jumps[i*2]); // index
    out.writeInt(jumps[i*2+1]); // offset
  }
  // As there are at most 32k blocks, the count is a short
  // The jumpTableOffset will be at lastPos - (blockCount * Long.BYTES)
  return (short)blockCount;
}
Example 18
Source File: BaseDirectoryTestSuite.java From incubator-retired-blur with Apache License 2.0
@Test
public void testWritingAndReadingAFile() throws IOException {
  IndexOutput output = directory.createOutput("testing.test", IOContext.DEFAULT);
  output.writeInt(12345);
  output.flush();
  output.close();

  IndexInput input = directory.openInput("testing.test", IOContext.DEFAULT);
  assertEquals(12345, input.readInt());
  input.close();

  String[] listAll = directory.listAll();
  assertEquals(1, listAll.length);
  assertEquals("testing.test", listAll[0]);
  assertEquals(4, directory.fileLength("testing.test"));

  IndexInput input1 = directory.openInput("testing.test", IOContext.DEFAULT);
  IndexInput input2 = (IndexInput) input1.clone();
  assertEquals(12345, input2.readInt());
  input2.close();

  assertEquals(12345, input1.readInt());
  input1.close();

  assertFalse(directory.fileExists("testing.test.other"));
  assertTrue(directory.fileExists("testing.test"));
  directory.deleteFile("testing.test");
  assertFalse(directory.fileExists("testing.test"));
}
Example 19
Source File: HdfsDirectoryManifestFileCacheTest.java From incubator-retired-blur with Apache License 2.0
private void createFile(HdfsDirectory dir, String name) throws IOException {
  IndexOutput output = dir.createOutput(name, IOContext.DEFAULT);
  output.writeInt(1);
  output.close();
}
Example 20
Source File: CodecUtil.java From lucene-solr with Apache License 2.0
/**
 * Writes a codec footer, which records both a checksum
 * algorithm ID and a checksum. This footer can
 * be parsed and validated with
 * {@link #checkFooter(ChecksumIndexInput) checkFooter()}.
 * <p>
 * CodecFooter --> Magic,AlgorithmID,Checksum
 * <ul>
 *    <li>Magic --> {@link DataOutput#writeInt Uint32}. This
 *        identifies the start of the footer. It is always {@value #FOOTER_MAGIC}.
 *    <li>AlgorithmID --> {@link DataOutput#writeInt Uint32}. This
 *        indicates the checksum algorithm used. Currently this is always 0,
 *        for zlib-crc32.
 *    <li>Checksum --> {@link DataOutput#writeLong Uint64}. The
 *        actual checksum value for all previous bytes in the stream, including
 *        the bytes from Magic and AlgorithmID.
 * </ul>
 *
 * @param out Output stream
 * @throws IOException If there is an I/O error writing to the underlying medium.
 */
public static void writeFooter(IndexOutput out) throws IOException {
  out.writeInt(FOOTER_MAGIC);
  out.writeInt(0);
  writeCRC(out);
}
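A minimal usage sketch of the footer format documented above (not taken from the Lucene sources): a header and a small int payload are written, writeFooter() appends the magic, algorithm ID, and checksum, and CodecUtil.checkFooter() verifies the checksum on read. The codec name "DemoCodec" and file name "demo.dat" are invented for illustration, and the two-argument openChecksumInput signature follows the Lucene 8.x API used by the lucene-solr examples on this page.

import org.apache.lucene.codecs.CodecUtil;
import org.apache.lucene.store.ByteBuffersDirectory;
import org.apache.lucene.store.ChecksumIndexInput;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexOutput;

public class FooterDemo {
  public static void main(String[] args) throws Exception {
    try (Directory dir = new ByteBuffersDirectory()) {
      try (IndexOutput out = dir.createOutput("demo.dat", IOContext.DEFAULT)) {
        CodecUtil.writeHeader(out, "DemoCodec", 1); // codec name + version
        out.writeInt(42);                           // payload
        CodecUtil.writeFooter(out);                 // magic, algorithm ID, checksum
      }
      try (ChecksumIndexInput in = dir.openChecksumInput("demo.dat", IOContext.READONCE)) {
        CodecUtil.checkHeader(in, "DemoCodec", 1, 1);
        System.out.println(in.readInt());           // prints 42
        CodecUtil.checkFooter(in);                  // throws CorruptIndexException if the bytes were altered
      }
    }
  }
}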