How to Search (Case-Insensitive) in a Column Using LIKE Wildcard

How can I search (case-insensitive) in a column using LIKE wildcard?

SELECT  *
FROM trees
WHERE trees.`title` COLLATE UTF8_GENERAL_CI LIKE '%elm%'

Actually, if you add COLLATE UTF8_GENERAL_CI to the column's definition, you can skip these tricks altogether: comparisons will be case-insensitive automatically.

ALTER TABLE trees
MODIFY COLUMN title VARCHAR(…)
CHARACTER SET UTF8 COLLATE UTF8_GENERAL_CI;

This will also rebuild any indexes on this column, so they can be used for queries whose pattern has no leading '%'.
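To illustrate that point, a small sketch (assuming the altered trees.title column above and an index on it): a pattern anchored at the start can use the index, while a leading wildcard still forces a scan.

-- Can use an index on title (no leading wildcard):
SELECT * FROM trees WHERE title LIKE 'elm%'

-- Cannot seek the index (leading wildcard):
SELECT * FROM trees WHERE title LIKE '%elm%'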

SQL SELECT LIKE (Insensitive casing)

Use the LOWER function on both sides (the column and the search word(s)). That way, even if the query contains something like %VaLuE%, the casing won't matter.

select qt.*
from query_table qt
where LOWER(column_name) LIKE LOWER('%vAlUe%');

How to use LIKE for case insensitive search of partial terms

Thanks to @OldProgrammer: changing

SELECT * FROM log WHERE user LIKE '%dog%'

to

SELECT * FROM log WHERE user ILIKE '%dog%'

worked.
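Note that ILIKE is PostgreSQL's case-insensitive variant of LIKE, so it is not portable to every database. A rough sketch of the equivalence, assuming the same log table from the question ("user" is quoted here because USER is a reserved word in PostgreSQL):

-- ILIKE matches regardless of case (PostgreSQL only):
SELECT * FROM log WHERE "user" ILIKE '%dog%'

-- Roughly equivalent portable form:
SELECT * FROM log WHERE LOWER("user") LIKE '%dog%'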

Perform a Case insensitive Like query in a case sensitive SQL Server database

You can use the UPPER or LOWER function to convert both values to the same case. For example:

SELECT *
FROM YourTable
WHERE UPPER(YourColumn) = UPPER('VALUE')

Alternatively, you can specify the collation manually when comparing:

SELECT *
FROM YourTable
WHERE YourColumn = 'VALUE' COLLATE SQL_Latin1_General_CP1_CI_AI
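Both approaches carry over to LIKE as well; a sketch using the placeholder table and column names from above:

-- Fold both sides to the same case:
SELECT *
FROM YourTable
WHERE UPPER(YourColumn) LIKE '%VALUE%'

-- Or force a case-insensitive collation for the comparison:
SELECT *
FROM YourTable
WHERE YourColumn COLLATE SQL_Latin1_General_CP1_CI_AI LIKE '%value%'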

How to search a string (case-insensitive) in a query on a ClickHouse database?

There's no ILIKE operator. I think you can use lowerUTF8(); keep the search pattern in lowercase too, since the column side is lower-cased.

select id, comments from discussion where lowerUTF8(comments) LIKE '%data not reflect%';

However, this might be heavy on performance, since it has to convert every comments value to lowercase before matching.
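If your ClickHouse version supports it, a case-insensitive substring search is another option; a sketch against the same discussion table:

-- Returns the 1-based position of the substring, or 0 if it is not found:
select id, comments from discussion where positionCaseInsensitiveUTF8(comments, 'data not reflect') > 0;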

Case insensitive search with wildcard in Elasticsearch

I made a new index and added the mapping and settings as below:

{
  "new_index5" : {
    "aliases" : { },
    "mappings" : {
      "current" : {
        "properties" : {
          "did" : {
            "type" : "integer"
          },
          "fil_date" : {
            "type" : "double"
          },
          "file_nr" : {
            "type" : "double"
          },
          "filing_date" : {
            "type" : "double"
          },
          "id" : {
            "type" : "integer"
          },
          "mark_identification" : {
            "type" : "keyword",
            "normalizer" : "lowercase_normalizer"
          },
          "mark_text" : {
            "type" : "keyword",
            "normalizer" : "lowercase_normalizer"
          },
          "mark_type_id" : {
            "type" : "text"
          },
          "markdescr" : {
            "type" : "text"
          },
          "markdescrtext" : {
            "type" : "text"
          },
          "niceclmain" : {
            "type" : "double"
          },
          "owname" : {
            "type" : "keyword",
            "normalizer" : "lowercase_normalizer"
          },
          "party_name" : {
            "type" : "keyword",
            "normalizer" : "lowercase_normalizer"
          },
          "primary_code" : {
            "type" : "text"
          },
          "registration_date" : {
            "type" : "double"
          },
          "registration_number" : {
            "type" : "double"
          },
          "serial_number" : {
            "type" : "double"
          },
          "status_code" : {
            "type" : "text"
          },
          "statusapplication" : {
            "type" : "text"
          }
        }
      }
    },
    "settings" : {
      "index" : {
        "number_of_shards" : "5",
        "provided_name" : "new_index5",
        "creation_date" : "1527686957833",
        "analysis" : {
          "normalizer" : {
            "lowercase_normalizer" : {
              "filter" : [
                "lowercase"
              ],
              "type" : "custom",
              "char_filter" : [ ]
            }
          }
        },
        "number_of_replicas" : "1",
        "uuid" : "9YdUrs1cSBuqDJmvSPOm6g",
        "version" : {
          "created" : "6020499"
        }
      }
    }
  }
}

And I added an aggregation to my query for the first search case, like this:

GET _search
{
  "query": {
    "bool": {
      "must" : [
        {
          "match": {
            "mark_text": "smart"
          }
        }
      ]
    }
  },
  "aggs": {
    "mark_texts": {
      "terms": {
        "field": "mark_text"
      }
    }
  }
}

It gives me results including both "smart" and "SMART".

For the second search case, I am using a fuzzy query.

I still don't know exactly how the aggregation and the normalizer solved my problem, but I am trying to understand it.
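For anyone else puzzled by this: the normalizer lowercases the keyword values both at index time and at query time, so "smart" and "SMART" end up as the same stored term, which is why the match query and the terms aggregation treat them alike. You can check what the normalizer produces with the _analyze API (a sketch against the index above); it should return the single token smart:

GET new_index5/_analyze
{
  "normalizer": "lowercase_normalizer",
  "text": "SMART"
}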

LIKE with case sensitive wildcards

Try this instead:

LIKE '%[A-Z][A-Z][A-Z]%' COLLATE Latin1_General_Bin
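A fuller sketch of where this sits in a query (table and column names are placeholders): under the binary collation, [A-Z] matches only the upper-case letters A through Z, so the pattern finds values containing three consecutive capitals.

SELECT *
FROM YourTable
WHERE YourColumn LIKE '%[A-Z][A-Z][A-Z]%' COLLATE Latin1_General_Bin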

