[Solved] Will Linux track me like Apple, Google, and Microsoft?


painty_the_pirate · New Member · Joined Mar 5, 2025
AI has been tracking my activity on this device and others for decades. Every single action on my phone was considered by Siri. Siri didn't do anything malicious with the information, as far as I'm aware, but it was a little unsettling to find an "observation" setting turned on for every single app on my phone. Very "transparent" of Apple to allow you to turn that off, I'll add, and I wonder how long the toggles in Settings will last.

Does Linux feature a distribution with AI that hijacks my processor to contemplate me like the Azure cloud, Gemini, and Siri have been doing for so long? (hi microsoft, google, and apple).

Could it? Should it?
 


Linux itself doesn't share anything with anybody.
However, some applications may share data with their respective companies. Browsers are the worst offenders: Firefox, Chrome, Opera, etc. Applications like MS Edge and MS Teams share profile information with Microsoft.

Currently there isn't really an AI assistant like Alexa/Siri/Bixby quite yet. Although rumor has it...
If you're interested, the ones I know of are Mycroft AI, Jasper, and Snips. Maybe something called Kalliope, but I'm not sure it qualifies. But as far as I know, no distro incorporates these, at least not yet.
 
Rumor has it that Apple should've offered me a job right in between my Chrome realization and my Siri realization.

Do tell if there's more rumor. I'm sure I could learn to help somehow, at least until Copilot, Grok, and GPT get pulled.
 
I'll mention that you can run a number of free AI models on your computer but you probably want a pretty modern computer for that. I haven't researched this at any significant depth but I believe they all take prompts in the terminal and that there's no GUI associated with them at this time.

This has some privacy implications, specifically things like your data staying on your computer and the AI not associating with other applications on your computer unless somehow told to do so. But, before installing any of these AI models, be sure to read the ToS and privacy policy. You can look to see if they're collecting your information. (You can also monitor internet traffic while the application's running.)
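To make the "your data stays on your computer" point concrete, here's a minimal sketch that sends a prompt to a locally running Ollama server. The model name, the default port 11434, and the `/api/generate` endpoint are assumptions based on Ollama's defaults; any local runner with an HTTP API would work similarly. The only connection it opens is to localhost:

```python
import json
import urllib.request

# Ollama's local REST endpoint (default port -- an assumption; check
# your own runner's documentation).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a single, non-streaming generation request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to the local server; nothing leaves the machine."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(ask_local_model("llama3", "Say hello in five words."))
    except OSError:
        print("No local model server running on port 11434.")
```

The point is simply that all traffic goes to 127.0.0.1, which you can confirm with a network monitor while it runs.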
 
Has AI been around for 25 years?

I don't know about tracking, but I've heard that Ubuntu sends telemetry back to Canonical, a commercial firm. I don't know what's in that, but I made a personal decision never to install it.

I forget the details.
 
Ubuntu sends telemetry back to Canonical

It sends system information during the installation process.

Just untick the box and it doesn't send that information. It's my opinion that it should be opt-in and not opt-out, but it's trivial to turn off and transparent. Knowing the hardware that's being used can be pretty helpful when deciding where to place their resources, so I don't mind them knowing my hardware specs.
 
It sends system information during the installation process.

Just untick the box and it doesn't send that information. It's my opinion that it should be opt-in and not opt-out, but it's trivial to turn off and transparent. Knowing the hardware that's being used can be pretty helpful when deciding where to place their resources, so I don't mind them knowing my hardware specs.
I just verified that you are right! Thanks.
 
What transpired that caused this thread to be "Solved" ?
 
I guess it satisfied the OP's question. I added my Ubuntu post after that.
 
It sends system information during the installation process.

Just untick the box and it doesn't send that information. It's my opinion that it should be opt-in and not opt-out, but it's trivial to turn off and transparent. Knowing the hardware that's being used can be pretty helpful when deciding where to place their resources, so I don't mind them knowing my hardware specs.
A knowledgeable friend just told me "Ubuntu got caught a few years back including what too many users felt was too much telemetry. No idea how it is now." I guess that's where that got started.
 
A knowledgeable friend just told me "Ubuntu got caught a few years back including what too many users felt was too much telemetry. No idea how it is now." I guess that's where that got started.

At one point, they sent search data to Amazon to get revenue as an affiliate. That too was trivial to disable but enough people didn't like it so they stopped doing so.

As I recall, it too was a choice during the installation process. That was not their best move, but it did get removed quickly. That was more than a decade ago.

Oh, and crash reports may be shared - if you let them be shared. You can see the data that's included in a crash report. Frankly, that's the kind of information they should collect and pretty much all the major distros collect that data. It's pretty important to know about bugs, especially if there's a trend in the bugs. Once again, it tells the devs where to invest their resources.

Ubuntu is developed and distributed by a corporation, as are a number of other popular distros. They sell an enterprise product and support, but you get all of that for the low cost of nothing and up to 10 years of support for their LTS builds.

They're also the platform many other distros are based on. Their contributions are many, and their easy installation process made Linux accessible to many people who weren't so technically inclined. Between that and Wubi, they've done a great deal to make Linux more approachable to the masses. People have forgotten this, or never knew it to begin with. They did so much for Linux that, at their own expense, they used to send you installation media by mail.

Ubuntu and Knoppix seem to be unknown to what I'll call 'modern Linux users'. There's a ton of misinformation out there, almost as if some folks want to taint the pool for their own ends. Then this stuff is parroted by people who can be forgiven because they simply don't know any better.
 
Thank you for the warm welcome to the Linux forums. I came here mostly to tell stories and see if they're true. I've been spooked and spooked again taking a look under this machine's hood, and I'm just a few spooks away from the bulletproof LFS system of my dreams.

I've been speaking to the machines at length recently. One tidbit they gave me was that modern AI algorithms are shunting some processing to the client side. I started ringing alarm bells about this because it struck me as a security risk for the algorithm and client both, instead of a totally normal paradigm in our modern digital world. Now algorithms have gotten a bit more tight-lipped with me about their inner workings :(

They'll still talk to me about buffer overflow. Still have no idea what that is, I mostly read headers.

Thanks again for the welcome, and everything you've shared willingly. Thanks for the things shared unwillingly, too. The algorithms I spoke to must've had some really nice training data, and I wonder how much of that was you guys.
 
Don't listen too closely... otherwise you'll end up a paranoiac, or totally disconnected from the internet, which these days is like being disconnected from life itself.
 
I'll mention that you can run a number of free AI models on your computer but you probably want a pretty modern computer for that. I haven't researched this at any significant depth but I believe they all take prompts in the terminal and that there's no GUI associated with them at this time.

This has some privacy implications, specifically things like your data staying on your computer and the AI not associating with other applications on your computer unless somehow told to do so. But, before installing any of these AI models, be sure to read the ToS and privacy policy. You can look to see if they're collecting your information. (You can also monitor internet traffic while the application's running.)
As for AI and GUIs, there is OpenWebUI.

It works with DeepSeek too. I run it locally; local LLMs should be fine. I followed a guide.
The more VRAM your GPU has, the better. Mine has 16 gigs, and it still struggled with some AI workloads; I had too many parameters or whatever. Too many knobs to turn, and I don't know what they all do. I was playing with Stable Diffusion and some local LLMs.

But DeepSeek's 8-billion-parameter model uses a little over 7 gigs of VRAM. Of course, Linux itself uses around a gig too, so you'd better have more than 8 gigs for that. Mine is an older AMD card.
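The rough arithmetic behind those numbers is simple: weight memory is approximately parameter count times bytes per parameter (1 byte for 8-bit quantization, 2 for fp16), plus some overhead for the KV cache and runtime buffers. A small sketch (the overhead term is a placeholder, not a measured value):

```python
GIB = 2**30  # bytes per gibibyte

def estimate_vram_gib(params: float, bytes_per_param: float,
                      overhead_gib: float = 0.0) -> float:
    """Rough VRAM estimate: weights only, plus an optional overhead
    term for the KV cache and runtime buffers."""
    return params * bytes_per_param / GIB + overhead_gib

# An 8B model quantized to 8 bits (1 byte/param) needs about 7.45 GiB
# for weights alone -- matching the "a little over 7 gigs" above.
print(f"8B @ 8-bit: {estimate_vram_gib(8e9, 1.0):.2f} GiB")
# The same model at fp16 (2 bytes/param) would need roughly double.
print(f"8B @ fp16:  {estimate_vram_gib(8e9, 2.0):.2f} GiB")
```

This is only a lower bound; context length and the runner's own allocations push actual usage higher.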

I would say: do not trust anything an LLM says. I asked it about Squid proxy configuration, and it gave me three one-liner config options, none of which actually exist. Squid said as much. I have no clue where DeepSeek got them from; I've never seen them either, and the first time I configured Squid was back in 2013.
 

