Describe the bug
The server makes the bad assumption that the remote repository actively wants to be exposed through the MCP protocol and has not opted out.
To be GDPR-compatible, an opt-in mechanism would be best.
Robots.txt is not a good solution because it requires that people inform themselves and then opt out based on a user name or user agent, which can change or simply be ignored.
Affected version
All versions.
Steps to reproduce the behavior
Use the product on a repo created by somebody else.
The bad assumption is that "if it's publicly available, it must be available for model training or use by AI", which is not correct. Plenty of source-available licenses exist, but they don't allow that kind of thing. There are also no-AI variants of open-source licenses that should be respected.
Expected vs actual behavior
The real question here is about user consent: people should be allowed to opt in, but must not be treated as opted in without their informed consent first.
I would expect such a system to obtain informed consent before being allowed to connect to possibly proprietary or even dangerous remote systems.
The biggest problem is that if somebody's license doesn't allow this use, then MCP access should be disabled entirely.
Logs
Not needed.