TL;DR - which privacy-focused search engine do people recommend, preferably one that can also easily be used as a default option in Safari?
I ditched Google back around 2016, I'd guess, and since then I've used DDG as my default search engine.
As someone entrenched in the Apple ecosystem, it's always seemed like a sound choice, as it's one of the search engines built into Safari on both iOS and macOS.
After spending a bit more time recently playing around with and updating my Docker containers, I started hosting a Whoogle container, which seemed to work pretty well, but I don't see many people out there talking about it, so I'm not sure how good it actually is. I then tried a SearXNG container, but either had it misconfigured or something else was wrong, because I just wasn't getting many search results back.
At the moment I'm trying out Startpage, but I know there are potential privacy concerns, since they were partially acquired by a US ad-tech company (System1) in 2019.
I'm also playing around with different browsers at the moment, flicking between Safari, Firefox and Brave, which is how I stumbled across Brave Search. It seems pretty promising.
So, which search engines do you all recommend?
UPDATE: I probably should've done a poll! But the latest tally (if I've captured everything correctly) is:
- DuckDuckGo - 10
- Qwant / SearXNG / Kagi / Brave - 4
- Startpage / Ecosia - 2
- Google - 1
As to my other questions around browsers:
- Majority seem to use Firefox
- Some mentions of Brave
- One mention of Arc
Just using DuckDuckGo. I'm not happy with my search results, as they heavily prioritize clickbait SEO blogs instead of showing official documentation / sources.
Using Qwant because it's developed and hosted in France. As a European, that's better than supporting a US company.
But Qwant uses Bing
So you’re still supporting Microsoft
They use their own indexer.
I've had a pretty similar journey to yours and I'm currently using Qwant, although the only reason I'm using them is that they're based in Europe and haven't had any scandals that I could find. If you're really concerned about privacy, I've heard good things about Kagi.
They have a partnership with Microsoft. Not saying it’s a scandal but that was my reason to stop using it
I like the fact that Brave Search has an AI; that's why I use it. I might self-host a FOSS search engine, though.
Someone pointed me in the direction of these guides:
https://duckduckgo.com/duckduckgo-help-pages/results/syntax/
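For example, a few of the operators that page covers (the queries here are just made-up illustrations, and the last one is a bang rather than an operator):

```
cats -dogs                        fewer results about dogs
"exact phrase" site:github.com    exact match, limited to one site
filetype:pdf self-hosting guide   restrict results to a file type
!w duck typing                    a bang that searches Wikipedia directly
```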
I'm using an ad and tracking blocker on all my devices, so I'm not too worried about using Google or Bing when I do.
But I'm hosting my own instance of SearXNG, and that's often simply the most powerful and flexible search engine.
I also self-host SearXNG. It's definitely the way to go. I like that you can choose which search engines to pull from.
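If anyone's wondering where that lives, it's the engines list in settings.yml. A minimal sketch (the engine names here are just examples, check your own defaults):

```yaml
# settings.yml (sketch): keep SearXNG's defaults, but toggle individual engines
use_default_settings: true

engines:
  - name: google        # example: drop Google results entirely
    disabled: true
  - name: duckduckgo    # example: keep DDG enabled
    disabled: false
  - name: brave
    disabled: false
```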
I replied to another comment on here saying that I’d tried this once before, via a Docker container, but just wasn’t getting any results back (kept getting timeouts from all the search engines).
I’ve just revisited it, and still get the timeouts. Reckon you’re able to help me troubleshoot it?
Below are the logs from Portainer:
File "/usr/local/searxng/searx/network/__init__.py", line 165, in get return request('get', url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/network/__init__.py", line 98, in request raise httpx.TimeoutException('Timeout', request=None) from e httpx.TimeoutException: Timeout 2023-08-06 09:58:13,651 ERROR:searx.engines.soundcloud: Fail to initialize Traceback (most recent call last): File "/usr/local/searxng/searx/network/__init__.py", line 96, in request return future.result(timeout) ^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result raise TimeoutError() TimeoutError The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize self.engine.init(get_engine_from_settings(self.engine_name)) File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init guest_client_id = get_client_id() ^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id response = http_get("https://soundcloud.com") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/network/__init__.py", line 165, in get return request('get', url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/network/__init__.py", line 98, in request raise httpx.TimeoutException('Timeout', request=None) from e httpx.TimeoutException: Timeout 2023-08-06 09:58:13,654 ERROR:searx.engines.soundcloud: Fail to initialize Traceback (most recent call last): File "/usr/local/searxng/searx/network/__init__.py", line 96, in request return future.result(timeout) ^^^^^^^^^^^^^^^^^^^^^^ File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result raise TimeoutError() TimeoutError The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize self.engine.init(get_engine_from_settings(self.engine_name)) File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init guest_client_id = get_client_id() ^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id response = http_get("https://soundcloud.com") ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/network/__init__.py", line 165, in get return request('get', url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/searxng/searx/network/__init__.py", line 98, in request raise httpx.TimeoutException('Timeout', request=None) from e httpx.TimeoutException: Timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.wikidata: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.duckduckgo: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.google: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.qwant: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.startpage: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.wikibooks: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.wikiquote: engine timeout 2023-08-06 10:02:05,024 ERROR:searx.engines.wikisource: engine timeout 2023-08-06 10:02:05,025 ERROR:searx.engines.wikipecies: engine timeout 2023-08-06 10:02:05,025 ERROR:searx.engines.wikiversity: engine timeout 2023-08-06 10:02:05,025 ERROR:searx.engines.wikivoyage: engine timeout 2023-08-06 10:02:05,025 ERROR:searx.engines.brave: engine timeout 2023-08-06 10:02:05,481 
WARNING:searx.engines.wikidata: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,481 ERROR:searx.engines.wikidata: HTTP requests timeout (search duration : 6.457878380082548 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,482 WARNING:searx.engines.wikisource: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,484 ERROR:searx.engines.wikisource: HTTP requests timeout (search duration : 6.460748491808772 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,485 WARNING:searx.engines.brave: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,485 ERROR:searx.engines.brave: HTTP requests timeout (search duration : 6.461546086706221 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,487 WARNING:searx.engines.google: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,487 ERROR:searx.engines.google: HTTP requests timeout (search duration : 6.463769535068423 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,489 WARNING:searx.engines.wikiversity: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,489 ERROR:searx.engines.wikiversity: HTTP requests timeout (search duration : 6.466003180015832 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,490 WARNING:searx.engines.wikivoyage: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,490 ERROR:searx.engines.wikivoyage: HTTP requests timeout (search duration : 6.466597221791744 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,490 WARNING:searx.engines.qwant: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,490 ERROR:searx.engines.qwant: HTTP requests timeout (search duration : 6.4669976509176195 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,491 WARNING:searx.engines.wikibooks: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,491 ERROR:searx.engines.wikibooks: HTTP requests timeout (search duration : 6.4674198678694665 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,491 WARNING:searx.engines.wikiquote: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,492 WARNING:searx.engines.wikipecies: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,492 ERROR:searx.engines.wikiquote: HTTP requests timeout (search duration : 6.468321242835373 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,492 
ERROR:searx.engines.wikipecies: HTTP requests timeout (search duration : 6.468797960784286 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,496 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,497 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 6.47349306801334 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:02:05,511 WARNING:searx.engines.startpage: ErrorContext('searx/engines/startpage.py', 214, 'resp = get(get_sc_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:02:05,511 ERROR:searx.engines.startpage: HTTP requests timeout (search duration : 6.487425099126995 s, timeout: 6.0 s) : TimeoutException 2023-08-06 10:04:27,475 ERROR:searx.engines.duckduckgo: engine timeout 2023-08-06 10:04:27,770 WARNING:searx.engines.duckduckgo: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False 2023-08-06 10:04:27,771 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.2968566291965544 s, timeout: 3.0 s) : TimeoutException 2023-08-06 10:04:50,094 ERROR:searx.engines.duckduckgo: engine timeout 2023-08-06 10:04:50,187 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.ConnectTimeout', None, (None, None, 'duckduckgo.com')) False 2023-08-06 10:04:50,187 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.0933595369569957 s, timeout: 3.0 s) : ConnectTimeout
The above is a simple search for “best privacy focused search engines 2023”, followed by the same search again but using the ddg! bang in front of it.
I can post my docker-compose if it helps?
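One thing I was going to try in the meantime is loosening the outgoing timeouts in settings.yml, since the logs show the 3.0 s / 6.0 s limits being hit. Something like this, assuming I've got the key names right from the SearXNG docs (the values are just guesses):

```yaml
# settings.yml (sketch): raise the HTTP timeouts while troubleshooting
outgoing:
  request_timeout: 10.0      # per-request default; logs above show 3.0 s being exceeded
  max_request_timeout: 15.0  # upper bound an individual engine may raise it to
```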
First thing that comes to mind: are you running it on the host network? That's a requirement.
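Roughly what I mean, as a sketch (the image tag, paths and base URL are just illustrative, adjust to your setup):

```yaml
# docker-compose.yml (sketch): run SearXNG directly on the host network
services:
  searxng:
    image: searxng/searxng:latest
    container_name: searxng
    network_mode: host              # no port mapping needed; listens on 8080 by default
    volumes:
      - ./searxng:/etc/searxng      # settings.yml lives here
    environment:
      - SEARXNG_BASE_URL=http://localhost:8080/
    restart: unless-stopped
```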
Mullvad Leta
For a search engine I go with SearXNG, and for a web browser, Mull, or hardened Firefox on PC.
Do you use a self-hosted SearXNG, or one of the other hosted instances?
Self-hosted under a VPN would be the way to go if you want to be 100% sure.
The problem with public instances is that you can’t really know what the owner does with the data. There are safe ones, and malicious ones. I’d just look for one that has a good reputation.
Yeah, I have a self-hosted one but I'm struggling to get results. I posted under another comment on this thread; I was just gonna ask for some troubleshooting support.
Completely agree though, self-hosting over public instances all day long.
DDG, because the bangs are nice, but I find myself doing general searches very little these days. I usually just use a bang to search a site I know I'll find what I want on; if there's no bang, I'll just navigate to that site. Search results have been shit for over a decade.
I use Firefox as my main browser, but I've found that for my use case Google provides the best results without needing to set up every workstation (2 home PCs, 1 mobile, 2 for work), especially since I need to use all 3 main browsers. Google also provides handy functions for quick currency conversions, simple math equations etc., where even Bing is far behind.
"AI" services will change this, but for now they're too slow.
In general, though, I gave up trying to stay private many years back. It's all a dream, just like living off-grid; 99% of us wouldn't survive 4 days.
But the information can be scrambled, i.e. by shifting between user accounts, services, software etc. That would also create better competition, since the user base would be moving around… But most of us are too lazy, or too afraid of losing history, backups, photos etc. Just look at how many people can't bring themselves to delete an old Reddit account because of the time spent reaching a level they aren't ready to leave. Which leads to a famous quote I follow online:
“Don’t let yourself get attached to anything you are not willing to walk out on in 30 seconds flat if you feel the heat around the corner”
It's all about supporting the services you like that are trying to be a counterweight to the other common commercial services… Meaning we need to fund/pay for good services; privacy is a luxury when you look at the whole user base.
Companies may be based on an idea, but they exist to make someone money, and if that means tracking, it will make a lot of people money; in the end the majority will lead the company to chase earnings and leave the original idea behind.
Does anyone have experience/thoughts about Mojeek?
I just recently learned about it and haven’t really explored it, but curious.
Thanks!
The concept is nice but the search results are less than ideal.
Been using Qwant for a few years. Good enough for 90% of searches imo. For whatever’s left, I’ll use DDG, Google in incognito, or Bing.