Comment by brandonb 4 days ago

I worked on one of the first wearable foundation models in 2018. The innovation of this 2025 paper from Apple is moving up to a higher level of abstraction: instead of training on raw sensor data (PPG, accelerometer), it trains on a timeseries of behavioral biomarkers derived from that data (e.g., HRV, resting heart rate, and so on).

They find high accuracy in detecting many conditions: diabetes (83%), heart failure (90%), sleep apnea (85%), etc.
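To make the abstraction shift concrete, here's a toy sketch: the model doesn't see raw sensor samples, it sees daily biomarkers computed from them. The data and the `resting_hr` definition below are invented for illustration (real wearables compute resting heart rate with far more care):

```python
# Sketch: deriving one daily biomarker from raw heart-rate samples.
# Synthetic data; "resting HR" here is just the minimum of a 3-sample
# moving average, a deliberate simplification.

raw_hr = [72, 68, 61, 59, 75, 90, 84, 63]  # one day of HR samples (bpm)

def resting_hr(samples, window=3):
    """Minimum of the moving average over `window` samples."""
    smoothed = [sum(samples[i:i + window]) / window
                for i in range(len(samples) - window + 1)]
    return min(smoothed)

daily_biomarker = resting_hr(raw_hr)
print(daily_biomarker)  # a single scalar per day; the model trains on
                        # a timeseries of values like this, not raw_hr
```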

teiferer 3 days ago

What is an "accuracy" of 83%? Do 83% of predicted diabetes cases actually have diabetes? Or did 83% of those who have diabetes get diagnosed as such? It's about precision vs. recall. You can improve one by sacrificing the other. Boiling it down to one number is hard.
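The precision/recall distinction is easy to see with toy confusion counts (these numbers are made up, not from the paper):

```python
# Precision vs. recall for a hypothetical diabetes classifier.
# Illustrative confusion counts only.
tp = 83  # predicted diabetic, actually diabetic
fp = 17  # predicted diabetic, actually healthy
fn = 40  # predicted healthy, actually diabetic

precision = tp / (tp + fp)  # of predicted cases, how many are real
recall = tp / (tp + fn)     # of real cases, how many were caught

print(f"precision={precision:.2f}, recall={recall:.2f}")
# → precision=0.83, recall=0.67
```

So a model can report "83%" on one axis while the other is much lower, which is exactly why a single accuracy number is hard to interpret.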

crorella 3 days ago

Insurance and health insurance companies must be super interested in this research and its applications.

  • jeron 3 days ago

    I'm sure they're also interested in the data. Imagine raising premiums based on conditions they detect from your wearables. That's why it's of utmost importance to secure biometric data.

    • brandonb 3 days ago

      At least in the US, health insurers can’t raise rates or deny coverage based on pre-existing conditions. That was a major part of the Affordable Care Act.

      • abenga 3 days ago

        The ACA will not survive the next couple of years.

    • apwell23 2 days ago

      how would that work. i pay flat rate through my employer.

  • autoexec 3 days ago

    There are so many companies across many industries who are salivating at the thought of everyone using wearables to monitor their "health" and getting their hands on that data. Including law enforcement, lawyers, and other government agencies.

    • teiferer 3 days ago

      It's industry leaders that are salivating the most.

throwaway314155 4 days ago

Had the phrase "foundation model" become a term of art yet?

  • brandonb 4 days ago

    By 2018, the concept was definitely in the air since you had GPT-1 (2018) and BERT (2018). You could argue even Word2Vec (2013) had the core concept of pre-training on an unsupervised or self-supervised objective leading to performance on a downstream semantic task. However, the phrase "foundation model" wasn't coined until 2021, to my knowledge.

    • throwaway314155 2 days ago

      I guess I just find the whole "foundation model" phrasing to be designed to pat the backs of the "winners", who would of course be those with the most money. I'm sure there are foundation models from groups that aren't e.g. OpenAI, but the origins felt egotistical, and asserting that you made one prior to the phrase's inception only feels more so.

      Had you merely called it an early instance of pretraining, I'd be fine with it.

puppymaster 4 days ago

reminds me of Jim Simons of Renaissance's advice when it comes to data science - sort first, then regress.
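One way to read that heuristic: stratify the data before fitting, so structure that a single global fit would smear out becomes visible. A minimal sketch under that interpretation, with synthetic data and a hand-rolled least-squares fit (nothing here is from Renaissance):

```python
# "Sort first, then regress": sort observations by a feature, split into
# buckets, then fit a simple least-squares line within each bucket.
# Synthetic data with a level shift at x = 5.

def fit_line(points):
    """Ordinary least squares for y = a + b*x on (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
    return my - b * mx, b  # intercept, slope

data = [(x, 2 * x + (1 if x > 5 else -1)) for x in range(11) if x != 5]
data.sort(key=lambda p: p[0])              # sort first
mid = len(data) // 2
low, high = data[:mid], data[mid:]
for bucket in (low, high):                 # then regress per bucket
    a, b = fit_line(bucket)
    print(f"intercept={a:+.2f}, slope={b:.2f}")
```

A single regression over all of `data` would estimate one intercept somewhere in between; the per-bucket fits recover the two regimes (intercepts -1 and +1, slope 2 in both).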