Java Code Examples for org.apache.solr.client.solrj.SolrClient#deleteByQuery()
The following examples show how to use org.apache.solr.client.solrj.SolrClient#deleteByQuery().
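Before the project examples, here is a minimal sketch of the basic call pattern. The URL, core name ("mycollection"), and query string are hypothetical placeholders, not taken from any of the projects below; adjust them to your own Solr setup.

import java.io.IOException;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class DeleteByQuerySketch {
    public static void main(String[] args) throws SolrServerException, IOException {
        // Hypothetical URL and core name; point this at your own Solr instance.
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            // Remove every document matching the query, then commit so the deletion becomes visible.
            client.deleteByQuery("category:obsolete");
            client.commit();
        }
    }
}

Several of the examples below also use the overloads that take a collection name or a commitWithin value in milliseconds.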
Example 1
Source File: DirectSolrClassicInputDocumentWriter.java From hbase-indexer with Apache License 2.0 | 6 votes |
/**
 * Has the same behavior as {@link org.apache.solr.client.solrj.SolrClient#deleteByQuery(String)}.
 *
 * @param deleteQuery delete query to be executed
 */
@Override
public void deleteByQuery(String deleteQuery) throws SolrServerException, IOException {
    try {
        for (SolrClient server : solrServers) {
            server.deleteByQuery(deleteQuery);
        }
    } catch (SolrException e) {
        if (isDocumentIssue(e)) {
            documentDeleteErrorMeter.mark(1);
        } else {
            solrDeleteErrorMeter.mark(1);
            throw e;
        }
    } catch (SolrServerException sse) {
        solrDeleteErrorMeter.mark(1);
        throw sse;
    }
}
Example 2
Source File: Indexing.java From jate with GNU Lesser General Public License v3.0 | 6 votes |
public static void main(String[] args) throws IOException, JATEException, SolrServerException {
    Logger logger = Logger.getLogger(Indexing.class.getName());
    JATEProperties prop = new JATEProperties(args[0]);
    boolean deletePrevious = Boolean.valueOf(args[2]);
    SolrClient solrClient = new EmbeddedSolrServer(Paths.get(args[3]), args[4]);
    try {
        if (deletePrevious) {
            logger.info("DELETING PREVIOUS INDEX");
            solrClient.deleteByQuery("*:*");
            solrClient.commit();
        }
        logger.info("INDEXING BEGINS");
        IndexingHandler m = new IndexingHandler();
        List<String> files = new ArrayList<>();
        for (File f : new File(args[1]).listFiles())
            files.add(f.toString());
        m.index(files, prop.getIndexerMaxUnitsToCommit(), new TikaSimpleDocumentCreator(), solrClient, prop);
        logger.info("INDEXING COMPLETE");
        System.exit(0);
    } finally {
        solrClient.close();
    }
}
Example 3
Source File: BackupRestoreUtils.java From lucene-solr with Apache License 2.0 | 6 votes |
public static int indexDocs(SolrClient masterClient, String collectionName, long docsSeed) throws IOException, SolrServerException {
    masterClient.deleteByQuery(collectionName, "*:*");

    Random random = new Random(docsSeed); // use a constant seed for the whole test run so that we can easily re-index.
    int nDocs = TestUtil.nextInt(random, 1, 100);
    log.info("Indexing {} test docs", nDocs);

    List<SolrInputDocument> docs = new ArrayList<>(nDocs);
    for (int i = 0; i < nDocs; i++) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", i);
        doc.addField("name", "name = " + i);
        docs.add(doc);
    }
    masterClient.add(collectionName, docs);
    masterClient.commit(collectionName);
    return nDocs;
}
Example 4
Source File: RootFieldTest.java From lucene-solr with Apache License 2.0 | 6 votes |
@Test
public void testUpdateWithChildDocs() throws Exception {
    SolrClient client = getSolrClient();
    client.deleteByQuery("*:*"); // delete everything!

    // Add child free doc
    SolrInputDocument docToUpdate = new SolrInputDocument();
    String docId = "11";
    docToUpdate.addField("id", docId);
    docToUpdate.addField("name", "parent doc with a child");

    SolrInputDocument child = new SolrInputDocument();
    child.addField("id", "111");
    child.addField("name", "child doc");
    docToUpdate.addChildDocument(child);

    if (!useRootSchema) {
        thrown.expect(SolrException.class);
        thrown.expectMessage("Unable to index docs with children:"
            + " the schema must include definitions for both a uniqueKey field"
            + " and the '_root_' field, using the exact same fieldType");
    }
    client.add(docToUpdate);
    client.commit();
}
Example 5
Source File: AbstractAlfrescoDistributedIT.java From SearchServices with GNU Lesser General Public License v3.0 | 5 votes |
/**
 * Delete by query on all clients.
 */
public static void deleteByQueryAllClients(String q) throws Exception {
    List<SolrClient> clients = getStandaloneAndShardedClients();

    for (SolrClient client : clients) {
        client.deleteByQuery(q);
    }
}
Example 6
Source File: MCRSolrIndexer.java From mycore with GNU General Public License v3.0 | 5 votes |
public static void dropIndexByType(String type, SolrClient client) throws Exception {
    if (!MCRObjectID.isValidType(type) || "data_file".equals(type)) {
        LOGGER.warn("The type {} is not a valid type in the actual environment", type);
        return;
    }

    LOGGER.info("Dropping solr index for type {}...", type);
    String deleteQuery = new MessageFormat("objectType:{0} _root_:*_{1}_*", Locale.ROOT)
        .format(new Object[] { type, type });
    client.deleteByQuery(deleteQuery, BATCH_AUTO_COMMIT_WITHIN_MS);
    LOGGER.info("Dropping solr index for type {}...done", type);
}
Example 7
Source File: MCRSolrClassificationUtil.java From mycore with GNU General Public License v3.0 | 5 votes |
/**
 * Drops the whole solr classification index.
 */
public static void dropIndex() {
    try {
        SolrClient solrClient = getCore().getConcurrentClient();
        solrClient.deleteByQuery("*:*");
    } catch (Exception exc) {
        LOGGER.error("Unable to drop solr classification index", exc);
    }
}
Example 8
Source File: HBaseMapReduceIndexerTool.java From hbase-indexer with Apache License 2.0 | 5 votes |
private void clearSolr(Map<String, String> indexConnectionParams) throws SolrServerException, IOException {
    Set<SolrClient> servers = createSolrClients(indexConnectionParams);
    for (SolrClient server : servers) {
        server.deleteByQuery("*:*");
        server.commit(false, false);
        try {
            server.close();
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }
}
Example 9
Source File: UsingSolrJRefGuideExamplesTest.java From lucene-solr with Apache License 2.0 | 5 votes |
@After
@Override
public void tearDown() throws Exception {
    super.tearDown();
    ensureNoLeftoverOutputExpectations();

    final SolrClient client = getSolrClient();
    client.deleteByQuery("techproducts", "*:*");
    client.commit("techproducts");
}
Example 10
Source File: SolrTestCaseHS.java From lucene-solr with Apache License 2.0 | 5 votes |
public void deleteByQuery(String query, ModifiableSolrParams params) throws IOException, SolrServerException {
    if (local()) {
        assertU(delQ(query)); // todo - handle extra params
        return;
    }
    for (SolrClient client : provider.all()) {
        client.deleteByQuery(query); // todo - handle extra params
    }
}
Example 11
Source File: AbstractSolrMorphlineTest.java From kite with Apache License 2.0 | 4 votes |
private void deleteAllDocuments() throws SolrServerException, IOException {
    collector.reset();
    SolrClient s = solrClient;
    s.deleteByQuery("*:*"); // delete everything!
    s.commit();
}
Example 12
Source File: TestSimLargeCluster.java From lucene-solr with Apache License 2.0 | 4 votes |
@Test
public void testFreediskTracking() throws Exception {
    int NUM_DOCS = 100000;
    String collectionName = "testFreeDisk";
    SolrClient solrClient = cluster.simGetSolrClient();
    CollectionAdminRequest.Create create = CollectionAdminRequest.createCollection(collectionName, "conf", 2, 2);
    create.process(solrClient);
    CloudUtil.waitForState(cluster, "Timed out waiting for replicas of new collection to be active", collectionName,
        CloudUtil.clusterShape(2, 2, false, true));

    ClusterState clusterState = cluster.getClusterStateProvider().getClusterState();
    DocCollection coll = clusterState.getCollection(collectionName);
    Set<String> nodes = coll.getReplicas().stream()
        .map(r -> r.getNodeName())
        .collect(Collectors.toSet());

    Map<String, Number> initialFreedisk = getFreeDiskPerNode(nodes);

    // test small updates
    for (int i = 0; i < NUM_DOCS; i++) {
        SolrInputDocument doc = new SolrInputDocument("id", "id-" + i);
        solrClient.add(collectionName, doc);
    }
    Map<String, Number> updatedFreedisk = getFreeDiskPerNode(nodes);
    double delta = getDeltaFreeDiskBytes(initialFreedisk, updatedFreedisk);
    // 2 replicas - twice as much delta
    assertEquals(SimClusterStateProvider.DEFAULT_DOC_SIZE_BYTES * NUM_DOCS * 2, delta, delta * 0.1);

    // test small deletes - delete half of docs
    for (int i = 0; i < NUM_DOCS / 2; i++) {
        solrClient.deleteById(collectionName, "id-" + i);
    }
    Map<String, Number> updatedFreedisk1 = getFreeDiskPerNode(nodes);
    double delta1 = getDeltaFreeDiskBytes(initialFreedisk, updatedFreedisk1);
    // 2 replicas but half the docs
    assertEquals(SimClusterStateProvider.DEFAULT_DOC_SIZE_BYTES * NUM_DOCS * 2 / 2, delta1, delta1 * 0.1);

    // test bulk delete
    solrClient.deleteByQuery(collectionName, "*:*");
    Map<String, Number> updatedFreedisk2 = getFreeDiskPerNode(nodes);
    double delta2 = getDeltaFreeDiskBytes(initialFreedisk, updatedFreedisk2);
    // 0 docs - initial freedisk
    if (log.isInfoEnabled()) {
        log.info(cluster.dumpClusterState(true));
    }
    assertEquals(0.0, delta2, delta2 * 0.1);

    // test bulk update
    UpdateRequest ureq = new UpdateRequest();
    ureq.setDocIterator(new FakeDocIterator(0, NUM_DOCS));
    ureq.process(solrClient, collectionName);
    Map<String, Number> updatedFreedisk3 = getFreeDiskPerNode(nodes);
    double delta3 = getDeltaFreeDiskBytes(initialFreedisk, updatedFreedisk3);
    assertEquals(SimClusterStateProvider.DEFAULT_DOC_SIZE_BYTES * NUM_DOCS * 2, delta3, delta3 * 0.1);
}
Example 13
Source File: RootFieldTest.java From lucene-solr with Apache License 2.0 | 4 votes |
@Test
public void testLegacyBlockProcessing() throws Exception {
    SolrClient client = getSolrClient();
    client.deleteByQuery("*:*"); // delete everything!

    // Add child free doc
    SolrInputDocument docToUpdate = new SolrInputDocument();
    String docId = "11";
    docToUpdate.addField("id", docId);
    docToUpdate.addField("name", "child free doc");
    client.add(docToUpdate);
    client.commit();

    SolrQuery query = new SolrQuery();
    query.setQuery("*:*");
    query.set(CommonParams.FL, "id,name,_root_");

    SolrDocumentList results = client.query(query).getResults();
    assertThat(results.getNumFound(), is(1L));
    SolrDocument foundDoc = results.get(0);

    // Check retrieved field values
    assertThat(foundDoc.getFieldValue("id"), is(docId));
    assertThat(foundDoc.getFieldValue("name"), is("child free doc"));

    String expectedRootValue = expectRoot() ? docId : null;
    assertThat(MESSAGE, foundDoc.getFieldValue("_root_"), is(expectedRootValue));

    // Update the doc
    docToUpdate.setField("name", "updated doc");
    client.add(docToUpdate);
    client.commit();

    results = client.query(query).getResults();
    assertEquals(1, results.getNumFound());
    foundDoc = results.get(0);

    // Check updated field values
    assertThat(foundDoc.getFieldValue("id"), is(docId));
    assertThat(foundDoc.getFieldValue("name"), is("updated doc"));
    assertThat(MESSAGE, foundDoc.getFieldValue("_root_"), is(expectedRootValue));
}
Example 14
Source File: BaseDistributedSearchTestCase.java From lucene-solr with Apache License 2.0 | 4 votes |
protected void del(String q) throws Exception {
    controlClient.deleteByQuery(q);
    for (SolrClient client : clients) {
        client.deleteByQuery(q);
    }
}
Example 15
Source File: SolrExampleStreamingBinaryTest.java From lucene-solr with Apache License 2.0 | 4 votes |
@Test
public void testQueryAndStreamResponse() throws Exception {
    // index a simple document with one child
    SolrClient client = getSolrClient();
    client.deleteByQuery("*:*");

    SolrInputDocument child = new SolrInputDocument();
    child.addField("id", "child");
    child.addField("type_s", "child");
    child.addField("text_s", "text");

    SolrInputDocument parent = new SolrInputDocument();
    parent.addField("id", "parent");
    parent.addField("type_s", "parent");
    parent.addChildDocument(child);

    client.add(parent);
    client.commit();

    // create a query with child doc transformer
    SolrQuery query = new SolrQuery("{!parent which='type_s:parent'}text_s:text");
    query.addField("*,[child parentFilter='type_s:parent']");

    // test regular query
    QueryResponse response = client.query(query);
    assertEquals(1, response.getResults().size());
    SolrDocument parentDoc = response.getResults().get(0);
    assertEquals(1, parentDoc.getChildDocumentCount());

    // test streaming
    final List<SolrDocument> docs = new ArrayList<>();
    client.queryAndStreamResponse(query, new StreamingResponseCallback() {
        @Override
        public void streamSolrDocument(SolrDocument doc) {
            docs.add(doc);
        }

        @Override
        public void streamDocListInfo(long numFound, long start, Float maxScore) {
        }
    });
    assertEquals(1, docs.size());
    parentDoc = docs.get(0);
    assertEquals(1, parentDoc.getChildDocumentCount());
}
Example 16
Source File: SolrExampleStreamingBinaryHttp2Test.java From lucene-solr with Apache License 2.0 | 4 votes |
@Test
public void testQueryAndStreamResponse() throws Exception {
    // index a simple document with one child
    SolrClient client = getSolrClient();
    client.deleteByQuery("*:*");

    SolrInputDocument child = new SolrInputDocument();
    child.addField("id", "child");
    child.addField("type_s", "child");
    child.addField("text_s", "text");

    SolrInputDocument parent = new SolrInputDocument();
    parent.addField("id", "parent");
    parent.addField("type_s", "parent");
    parent.addChildDocument(child);

    client.add(parent);
    client.commit();

    // create a query with child doc transformer
    SolrQuery query = new SolrQuery("{!parent which='type_s:parent'}text_s:text");
    query.addField("*,[child parentFilter='type_s:parent']");

    // test regular query
    QueryResponse response = client.query(query);
    assertEquals(1, response.getResults().size());
    SolrDocument parentDoc = response.getResults().get(0);
    assertEquals(1, parentDoc.getChildDocumentCount());

    // test streaming
    final List<SolrDocument> docs = new ArrayList<>();
    client.queryAndStreamResponse(query, new StreamingResponseCallback() {
        @Override
        public void streamSolrDocument(SolrDocument doc) {
            docs.add(doc);
        }

        @Override
        public void streamDocListInfo(long numFound, long start, Float maxScore) {
        }
    });
    assertEquals(1, docs.size());
    parentDoc = docs.get(0);
    assertEquals(1, parentDoc.getChildDocumentCount());
}
Example 17
Source File: MCRSolrCategLinkService.java From mycore with GNU General Public License v3.0 | 4 votes |
/**
 * Delete the given reference in solr.
 */
protected void delete(SolrClient solrClient, MCRCategLinkReference reference) throws SolrServerException, IOException {
    solrClient.deleteByQuery("+type:link +object:" + reference.getObjectID(), 500);
}
Example 18
Source File: MCRSolrIndexer.java From mycore with GNU General Public License v3.0 | 4 votes |
/**
 * Drops the current solr index.
 */
public static void dropIndex(SolrClient client) throws Exception {
    LOGGER.info("Dropping solr index...");
    client.deleteByQuery("*:*", BATCH_AUTO_COMMIT_WITHIN_MS);
    LOGGER.info("Dropping solr index...done");
}
Example 19
Source File: MCRSolrIndexer.java From mycore with GNU General Public License v3.0 | 3 votes |
/**
 * Deletes orphaned nested documents.
 *
 * https://issues.apache.org/jira/browse/SOLR-6357
 *
 * @return the response or null if {@link MCRSolrUtils#useNestedDocuments()} returns false
 * @throws SolrServerException solr server exception
 * @throws IOException io exception
 */
public static UpdateResponse deleteOrphanedNestedDocuments(SolrClient solrClient)
        throws SolrServerException, IOException {
    if (!MCRSolrUtils.useNestedDocuments()) {
        return null;
    }
    return solrClient.deleteByQuery("-({!join from=id to=_root_ score=none}_root_:*) +_root_:*", 0);
}