What's Actually Happening

The Solr search index stops receiving updates from its data source: new documents are not indexed, and search results grow stale.

The Error You'll See

No new documents:

```bash
$ curl "http://localhost:8983/solr/mycore/select?q=*:*&rows=0"

{ "response": { "numFound": 10000 } }
# Document count not increasing despite source updates
```

Data import failure:

```bash
$ curl "http://localhost:8983/solr/mycore/dataimport?command=status"

{ "status": "idle", "messages": ["Time taken: 0s", "Total Requests made to DataSource: 0"] }
```

Update handler error:

```xml
<lst name="error">
  <str name="msg">Error adding field 'field_name'='value'</str>
</lst>
```

Why This Happens

  1. Schema mismatch - field type incompatible with the data
  2. Data import config - DIH misconfigured or broken
  3. Autocommit disabled - changes not persisted
  4. Update log full - transaction log issues
  5. Memory pressure - indexing buffer exhausted
  6. Field missing - required field not provided
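As a quick triage aid, error messages in solr.log can be pattern-matched to the causes above. A minimal sketch; the function name and patterns are illustrative, not part of Solr:

```shell
#!/usr/bin/env bash
# Illustrative triage helper: map a Solr error message to the most
# likely cause from the list above. Patterns are examples, not exhaustive.
classify_solr_error() {
  local msg="$1"
  case "$msg" in
    *"unknown field"*)               echo "schema: field not defined" ;;
    *"Error adding field"*)          echo "schema: field type incompatible" ;;
    *"missing mandatory uniqueKey"*) echo "field missing: no uniqueKey provided" ;;
    *"OutOfMemoryError"*)            echo "memory pressure: increase heap" ;;
    *)                               echo "unclassified: inspect solr.log" ;;
  esac
}

classify_solr_error "ERROR: [doc=42] unknown field 'tittle'"
```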

Step 1: Check Index Status

```bash
# Check core status:
curl "http://localhost:8983/solr/admin/cores?action=STATUS&core=mycore"

# Check document count:
curl "http://localhost:8983/solr/mycore/select?q=*:*&rows=0"

# Check last update (empty commit):
curl "http://localhost:8983/solr/mycore/update/json?commit=true" -d '{}'

# Check index statistics:
curl "http://localhost:8983/solr/mycore/admin/luke?numTerms=0"

# Check update handler stats:
curl "http://localhost:8983/solr/mycore/admin/mbeans?cat=UPDATEHANDLER&stats=true"

# Check for errors:
curl "http://localhost:8983/solr/mycore/admin/ping"
```
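A single document count tells you little on its own; compare two samples taken a few minutes apart. A hedged sketch (the function name is made up for illustration; in practice each sample would come from the `select?q=*:*&rows=0` query above):

```shell
#!/usr/bin/env bash
# Sketch: decide whether the index looks stale by comparing two
# numFound samples taken some minutes apart.
index_growth() {
  local before="$1" after="$2"
  if [ "$after" -gt "$before" ]; then
    echo "growing (+$((after - before)) docs)"
  else
    echo "stale"   # no growth despite source updates
  fi
}

index_growth 10000 10000
```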

Step 2: Check Schema Configuration

```bash
# View schema:
curl "http://localhost:8983/solr/mycore/schema?wt=json" | jq '.schema.fields'

# Or read the schema file:
cat /var/solr/data/mycore/conf/managed-schema

# Check a specific field:
curl "http://localhost:8983/solr/mycore/schema/fields/title?wt=json"

# Common issue 1 - field not defined; add it via the Schema API:
curl -X POST "http://localhost:8983/solr/mycore/schema" \
  -H 'Content-Type: application/json' \
  -d '{
    "add-field": {
      "name": "new_field",
      "type": "text_general",
      "indexed": true,
      "stored": true
    }
  }'

# Common issue 2 - wrong field type:
#   date fields must use a date type;
#   numbers need the appropriate int/long/float/double type

# Check the unique key:
curl "http://localhost:8983/solr/mycore/schema?wt=json" | jq '.schema.uniqueKey'

# List required fields:
curl "http://localhost:8983/solr/mycore/schema?wt=json" | jq '.schema.fields[] | select(.required == true)'
```
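Type mismatches can be caught before they ever reach the update handler by pre-checking values against the declared field type. A sketch; the helper is illustrative and only covers Solr's common point/date type names:

```shell
#!/usr/bin/env bash
# Sketch: pre-check a value against a declared Solr field type to
# catch "Error adding field" type mismatches before sending.
validate_field_value() {
  local type="$1" value="$2"
  case "$type" in
    pint|plong)
      [[ "$value" =~ ^-?[0-9]+$ ]] && echo "ok" || echo "type mismatch" ;;
    pfloat|pdouble)
      [[ "$value" =~ ^-?[0-9]+(\.[0-9]+)?$ ]] && echo "ok" || echo "type mismatch" ;;
    pdate)
      [[ "$value" =~ ^[0-9]{4}-[0-9]{2}-[0-9]{2}T ]] && echo "ok" || echo "type mismatch" ;;
    *)
      echo "ok" ;;   # text types accept any string
  esac
}

validate_field_value pint "abc"
```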

Step 3: Check Data Import Handler

Note: the DataImportHandler was deprecated in Solr 8.6 and removed from Solr 9; on 9.x it is only available as a separate community package.

```bash
# Check DIH status:
curl "http://localhost:8983/solr/mycore/dataimport?command=status"

# Run a full import:
curl "http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=true&commit=true"

# Run a delta import:
curl "http://localhost:8983/solr/mycore/dataimport?command=delta-import"

# Check the DIH configuration:
cat /var/solr/data/mycore/conf/data-config.xml
```

Example DIH config:

```xml
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.cj.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb"
              user="root"
              password="password"/>
  <document>
    <entity name="item"
            query="SELECT id, title, content FROM items"
            deltaQuery="SELECT id FROM items WHERE updated_at > '${dataimporter.last_index_time}'"
            deltaImportQuery="SELECT id, title, content FROM items WHERE id='${dataimporter.delta.id}'">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <field column="content" name="content"/>
    </entity>
  </document>
</dataConfig>
```

```bash
# Check for DIH errors:
curl "http://localhost:8983/solr/mycore/dataimport?command=status&verbose=true"
```
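Full imports run asynchronously, so a script should poll the status until it returns to idle rather than assume the import finished. A sketch; the status source is passed in as a command so the loop can be exercised without a live Solr (in real use it would wrap the `dataimport?command=status` call above):

```shell
#!/usr/bin/env bash
# Sketch: poll an import-status command until it reports "idle" again,
# with a bounded number of tries.
wait_for_import() {
  local tries="$1"; shift
  local i
  for ((i = 0; i < tries; i++)); do
    if [ "$("$@")" = "idle" ]; then
      echo "done"
      return 0
    fi
    sleep 1
  done
  echo "timeout"
  return 1
}

# Stand-in for: curl -s ".../dataimport?command=status" | jq -r '.status'
fake_status() { echo "idle"; }
RESULT=$(wait_for_import 5 fake_status)
```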

Step 4: Check Autocommit Settings

```bash
# Check solrconfig.xml:
grep -A 10 updateHandler /var/solr/data/mycore/conf/solrconfig.xml
```

Autocommit settings:

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- Hard commit after 15 seconds -->
  <autoCommit>
    <maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>

  <!-- Soft commit for near-real-time search -->
  <autoSoftCommit>
    <maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
  </autoSoftCommit>
</updateHandler>
```

```bash
# If autocommit is disabled, documents are not visible until a manual commit:
curl "http://localhost:8983/solr/mycore/update?commit=true"

# Enable autocommit: edit solrconfig.xml, then restart:
systemctl restart solr

# Or commit via the API:
curl -X POST "http://localhost:8983/solr/mycore/update?commit=true" -H 'Content-Type: application/json' -d '{}'
```
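The two intervals above determine how long a new document can stay invisible. A sketch of the worst-case reasoning, assuming `openSearcher=false` on hard commits (so visibility comes from soft commits); the function name is illustrative:

```shell
#!/usr/bin/env bash
# Sketch: worst-case delay (ms) before a newly added document becomes
# searchable, given the autoCommit/autoSoftCommit maxTime values.
# 0 means the interval is disabled.
visibility_delay_ms() {
  local soft_ms="$1" hard_ms="$2"
  if [ "$soft_ms" -gt 0 ]; then
    echo "$soft_ms"
  elif [ "$hard_ms" -gt 0 ]; then
    echo "never (hard commits here do not open a searcher)"
  else
    echo "never (manual commit required)"
  fi
}

visibility_delay_ms 1000 15000
```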

Step 5: Check Update Requests

```bash
# Test a direct document update:
curl -X POST "http://localhost:8983/solr/mycore/update/json?commit=true" \
  -H 'Content-Type: application/json' \
  -d '[{ "id": "test-1", "title": "Test Document", "content": "Test content" }]'

# Check whether the document was added:
curl "http://localhost:8983/solr/mycore/select?q=id:test-1"

# Watch the update logs:
tail -f /var/solr/logs/solr.log | grep -Ei "update|error|exception"

# Common update errors:
#   "unknown field"                                    - field not in schema
#   "Error adding field"                               - type mismatch
#   "Document is missing mandatory uniqueKey field"    - no ID supplied

# Update with atomic operations:
curl -X POST "http://localhost:8983/solr/mycore/update/json?commit=true" \
  -H 'Content-Type: application/json' \
  -d '[{ "id": "test-1", "title": {"set": "Updated Title"} }]'
```
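When scripting atomic updates, it helps to build the JSON body in one place. A minimal sketch; the helper name is made up, and real payloads with quotes or special characters would need proper JSON escaping (e.g. via jq):

```shell
#!/usr/bin/env bash
# Sketch: build the JSON body for a single-field atomic "set" update,
# matching the shape of the example above.
atomic_set_payload() {
  local id="$1" field="$2" value="$3"
  printf '[{"id": "%s", "%s": {"set": "%s"}}]' "$id" "$field" "$value"
}

atomic_set_payload "test-1" "title" "Updated Title"
# The output would then be POSTed to /solr/mycore/update/json?commit=true
```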

Step 6: Check Indexing Performance

```bash
# Check Solr system memory:
curl "http://localhost:8983/solr/admin/info/system?wt=json" | jq '.system.memory'

# Check JVM memory:
curl "http://localhost:8983/solr/admin/info/system?wt=json" | jq '.jvm.memory'

# If memory is low, increase the heap in solr.in.sh:
#   SOLR_JAVA_MEM="-Xms4g -Xmx4g"

# Check the indexing buffer:
curl "http://localhost:8983/solr/mycore/admin/mbeans?cat=CORE&stats=true" | jq '.statistics.indexWriter'

# Check the merge policy:
grep -A 5 "mergePolicy" /var/solr/data/mycore/conf/solrconfig.xml
```

For high-volume indexing:

```xml
<indexConfig>
  <ramBufferSizeMB>256</ramBufferSizeMB>
  <maxBufferedDocs>100000</maxBufferedDocs>
  <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
    <int name="maxMergeAtOnce">10</int>
    <int name="segmentsPerTier">10</int>
  </mergePolicy>
</indexConfig>
```

Step 7: Check Update Transaction Log

```bash
# Check tlog location:
ls -la /var/solr/data/mycore/tlog/

# Large tlog files indicate uncommitted changes:
du -sh /var/solr/data/mycore/tlog/

# Force a commit to clear the tlog:
curl -X POST "http://localhost:8983/solr/mycore/update?commit=true"

# Check the tlog after the commit (should be empty or small):
ls -la /var/solr/data/mycore/tlog/

# If the tlog is corrupt, stop Solr:
systemctl stop solr

# Remove the tlog (may lose uncommitted data):
rm -rf /var/solr/data/mycore/tlog/*

# Start Solr again:
systemctl start solr
```
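A cron job can watch the tlog size and flag when commits are not keeping up. A sketch under assumptions: the helper name and 100 MiB threshold are arbitrary examples, and the byte count would come from something like `du` against the tlog directory:

```shell
#!/usr/bin/env bash
# Sketch: flag a transaction log that has grown past a byte threshold,
# which usually means updates are accumulating without commits.
tlog_needs_commit() {
  local tlog_bytes="$1" threshold="${2:-104857600}"   # default: 100 MiB (example)
  if [ "$tlog_bytes" -gt "$threshold" ]; then
    echo "commit needed"
  else
    echo "ok"
  fi
}

tlog_needs_commit 250000000
```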

Step 8: Check Replication

```bash
# For SolrCloud or replication setups:

# Check replica status:
curl "http://localhost:8983/solr/admin/collections?action=CLUSTERSTATUS&collection=mycollection"

# List collections:
curl "http://localhost:8983/solr/admin/collections?action=LIST"

# Check that the collection responds:
curl "http://localhost:8983/solr/mycollection/admin/ping"

# If a replica is down, add a new one:
curl "http://localhost:8983/solr/admin/collections?action=ADDREPLICA&collection=mycollection&shard=shard1&node=localhost:8983_solr"

# Delete a failed replica:
curl "http://localhost:8983/solr/admin/collections?action=DELETEREPLICA&collection=mycollection&shard=shard1&replica=core_node1"
```

Step 9: Enable Debug Logging

```bash
# Enable debug logging for the update handler:
curl -X POST "http://localhost:8983/solr/admin/info/logging" -d 'set=org.apache.solr.update:DEBUG'

# Watch the logs:
tail -f /var/solr/logs/solr.log

# Enable DIH debug:
curl "http://localhost:8983/solr/mycore/dataimport?command=full-import&debug=true&verbose=true"

# Check the debug output:
curl "http://localhost:8983/solr/mycore/dataimport?command=status&verbose=true"

# Reset logging when done:
curl -X POST "http://localhost:8983/solr/admin/info/logging" -d 'set=org.apache.solr.update:WARN'
```

Step 10: Monitor Solr Updates

```bash
# Create a monitoring script:
cat << 'EOF' > /usr/local/bin/monitor-solr.sh
#!/bin/bash

CORE="mycore"
SOLR_URL="http://localhost:8983/solr"

echo "=== Core Status ==="
curl -s "$SOLR_URL/admin/cores?action=STATUS&core=$CORE" | jq '.status.mycore.index'

echo ""
echo "=== Document Count ==="
curl -s "$SOLR_URL/$CORE/select?q=*:*&rows=0" | jq '.response.numFound'

echo ""
echo "=== Update Handler Stats ==="
curl -s "$SOLR_URL/$CORE/admin/mbeans?cat=UPDATEHANDLER&stats=true" | jq '.["solr-mbeans"]'

echo ""
echo "=== DIH Status ==="
curl -s "$SOLR_URL/$CORE/dataimport?command=status" | jq '.status, .messages'

echo ""
echo "=== Memory Usage ==="
curl -s "$SOLR_URL/admin/info/system?wt=json" | jq '.jvm.memory'
EOF

chmod +x /usr/local/bin/monitor-solr.sh

# Solr update metrics (quote the URL so & is not interpreted by the shell):
curl "http://localhost:8983/solr/admin/metrics?group=core&prefix=UPDATE_HANDLER"

# Key metrics:
#   UPDATE_HANDLER.updateHandler.adds
#   UPDATE_HANDLER.updateHandler.commits
#   UPDATE_HANDLER.updateHandler.errors
```

Alert for no updates:

```yaml
- alert: SolrNoIndexUpdates
  expr: rate(solr_update_handler_adds[10m]) == 0
  for: 30m
  labels:
    severity: warning
  annotations:
    summary: "Solr index not receiving updates"
```
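Outside Prometheus, the same signal can be derived by hand from two samples of the cumulative adds counter. A sketch; the counter values are synthetic, and in real use they would come from the UPDATE_HANDLER metrics endpoint above:

```shell
#!/usr/bin/env bash
# Sketch: turn two samples of the cumulative "adds" counter, taken
# interval_s apart, into a per-minute rate. Zero means no index updates.
adds_per_minute() {
  local before="$1" after="$2" interval_s="$3"
  echo $(( (after - before) * 60 / interval_s ))
}

adds_per_minute 10000 10120 60   # 120 adds over one minute
```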

Solr Index Update Checklist

| Check | Command | Expected |
|-------|---------|----------|
| Document count | `select q=*:*` | Increasing |
| Schema fields | `schema fields` | Correct types |
| DIH status | `dataimport status` | Success |
| Autocommit | `solrconfig.xml` | Enabled |
| Update errors | `solr.log` | None |
| Memory | `admin info system` | Adequate |
| Tlog size | `tlog` directory | Small |

Verify the Fix

```bash
# After fixing index issues:

# 1. Add a test document (expect a success response):
curl -X POST "http://localhost:8983/solr/mycore/update/json?commit=true" -H 'Content-Type: application/json' -d '[{"id": "test-verify", "title": "Verify"}]'

# 2. Query immediately (the document should be found):
curl "http://localhost:8983/solr/mycore/select?q=id:test-verify"

# 3. Check the document count (should be increasing):
curl "http://localhost:8983/solr/mycore/select?q=*:*&rows=0"

# 4. Run a DIH import (should complete):
curl "http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=true"

# 5. Check for errors (status should report success):
curl "http://localhost:8983/solr/mycore/dataimport?command=status"

# 6. Monitor the update rate (updates should be flowing):
/usr/local/bin/monitor-solr.sh
```

  • [Fix Elasticsearch Query Taking Too Long](/articles/fix-elasticsearch-query-taking-too-long)
  • [Fix MongoDB Index Build Stuck](/articles/fix-mongodb-index-build-stuck)
  • [Fix ClickHouse Query Memory Limit Exceeded](/articles/fix-clickhouse-query-memory-limit-exceeded)