Great info. I'd like to use NotebookLM in my work, but I can't give it access to sensitive company information. Any suggestions on how to utilize this tool internally? We have access to GCP; maybe we could develop a similar workflow within that environment?
Really good quality post for a newbie here!
In the spirit of sharing new ideas on how this can be used... I have been pasting the output of Deep Research queries from ChatGPT into NotebookLM as a source, to absorb the complexity that comes out of deep research questions in an interesting way. One question was on the state of the art for using random forests in diagnosing diseases, where I could ask for the most successful applications and then ask for new ideas on how to make the outcomes even better. Interesting.
This is a super duper article. Really enjoyed it. Thanks for all the info!!!
Nice one thanks Michael
Write what your soul is speaking and whoever has the same frequency will respond back. The Law of reciprocity.
Share your tips on how you use it
I made a mobile app to do this for YouTube, to help me get summaries, and the next step is to synthesize things. I was building it for myself but hoped to monetize my time.
Seems like even niche opportunities are going to be overtaken by big tech in the space.
Thank you! A very useful starting point for getting NotebookLM into my own workflow.
Thank you. This made me jump into NotebookLM right away
That's great to hear, Zach; check out Deep Research too. For me, Deep Research has a fairly high ceiling for 2025, i.e. AI agents that do research and develop reports for us. I'm interested in how people are using Deep Research, Learn About, and NotebookLM together, plus of course Perplexity.
Currently I’m using Perplexity - pro search has been really powerful in driving my research. Will try out Deep Research sometime soon (too many subscriptions atm.)
NotebookLM really was my favorite 2024 tool. It takes some time to get used to, but the results are impressive. I also used it to launch one of my podcast episodes, asking the community how they feel when they know it's AI-generated. Here's the link, open to feedback.
https://open.spotify.com/episode/0ov7UrpAjvDyjwnvMA59lJ?si=AOQZGndXTKye1oPl3OvU1g&t=165
Aren’t you feeding this tool with information the same way you feed ChatGPT every time you use it? I read that people use it by inputting business documents etc…but isn’t it dangerous to feed it with confidential information, for example? Even if it helps you be more productive and efficient professionally.
The whole concept is really out of this world…thanks for everything you do👏
How much knowledge retention do you get with a tool like this? Sure, if you just need to write a report once and aren't interested in learning about the subject long-term, that's fine, but will using a tool like this actually help you learn, or will you just forget everything the next day? I'd like to hear from people using the tool regularly.
“It wont make things up…”? Hardly. I was rustic when I first used it about 6 weeks ago, asking it to review and summarize a lengthy, data-rich government report for me. The podcast mentioned a handful of numbers, one of which was totally hallucinated, existing nowhere in the report (and false, as far as I could tell from additional research). This tech is not fully reliable, and user guides shouldn’t pretend that it is.
No pretending, just reporting my own experience. I've never caught it pulling in outside data or hallucinating, but it's good to hear other users' experiences so we can get a better understanding. Most of these tools are black boxes, with relatively few users in the bigger picture.
Users should always test and experiment on their own and report back. This helps us get a better look into the limitations of such tools.
I was “optimistic” not “rustic.” I’m not reliable either apparently.
Played around with NotebookLM for the first time last night because of this post. Tried summarizing 40-ish annual reports across many industries. It's like the opposite of ChatGPT: it makes no assumptions at all, to the point that you really need to be specific about what you want to see.
In case anyone is curious or has thoughts: file:///Users/vedshankar/Documents/Books/Notebook%20LM%20experiment/Industry%20Strategy%20Deep%20Dives.pdf
Another workflow example for Google NotebookLM: use the podcast summary feature, as explained so well in this post, download the audio, and then reupload it BACK into NotebookLM to create a new source (or note) for reference. I've done this for some of the articles I've had to research. Helpful.