Hello community, how are you?
I am new to robots.txt files and SEO. I currently support an ASP.NET application and have been tasked with improving its SEO positioning.
While reviewing the site with the Lighthouse tool, it told me to create a robots.txt file and add it to my project to improve positioning, so I set about that task and created and configured the file. The problem is that when I run Lighthouse again, the SEO score has indeed gone up, but I still get robots.txt errors which it says are caused by my HTML, which doesn't make much sense to me. Below I show my robots.txt file and the error that appears in Lighthouse.
robots.txt file:
User-agent: *
Disallow: /*.aspx
Disallow: /_controls/
Disallow: /_master/
Disallow: /Account/
Disallow: /Admin/
Disallow: /Error/
Disallow: /Login/
Disallow: /Public/
User-agent: Googlebot
User-agent: AdsBot-Google
Disallow: /*.aspx
Disallow: /_controls/
Disallow: /_master/
Disallow: /Account/
Disallow: /Admin/
Disallow: /Error/
Disallow: /Login/
Disallow: /Public/
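As a local sanity check on the directives (separate from whatever Lighthouse reports), Python's standard urllib.robotparser can parse the file's rules. Note this is only an illustrative check: Python's parser does plain prefix matching and ignores the `*` wildcard extension that Googlebot supports, so the `Disallow: /*.aspx` line is not exercised here.

```python
from urllib import robotparser

# A subset of the robots.txt rules under test (prefix rules only;
# Python's parser ignores the "*" wildcard extension used by Googlebot).
ROBOTS_TXT = """\
User-agent: *
Disallow: /Account/
Disallow: /Admin/
Disallow: /Error/
Disallow: /Login/
Disallow: /Public/
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark the file as "fetched" so can_fetch() evaluates the rules
rp.parse(ROBOTS_TXT.splitlines())

# A path under a disallowed directory should be blocked for any crawler.
print(rp.can_fetch("*", "/Admin/index.aspx"))  # False
# The site root stays crawlable.
print(rp.can_fetch("*", "/"))  # True
```

If the directives parse and behave as expected locally, that suggests the Lighthouse complaint is about how the file is served (e.g. the server returning an HTML page instead of the plain-text file) rather than about the directives themselves.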
Browser errors (Lighthouse screenshot):

Thank you very much for your help!