I've recently used OpenSearch description files to add custom search engines to the Firefox browser. An OpenSearch XML descriptor file allows any website whose search page is actionable with a GET request to become a custom search engine in Firefox.
This also applies to websites that do not provide an OpenSearch description XML themselves.
Preparing this XML is easy; the basic workflow I follow is:

1. create two files:
   - a bare HTML page with the required `rel="search"` link pointing to ...
   - ... the OpenSearch XML descriptor, whose `Url` -> `template` attribute will be used to perform searches
2. spin up a web server on localhost serving these two files
3. point my browser to localhost and add the custom search engine, which points to an external search endpoint
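As an illustration of the two files from step (1), the pair might look like this (all names, ports, and URLs here are made up for the example):

```html
<!-- index.html: a bare page whose only job is to advertise the descriptor -->
<!DOCTYPE html>
<html>
  <head>
    <link rel="search"
          type="application/opensearchdescription+xml"
          href="http://localhost:8000/opensearch.xml"
          title="My Custom Search">
  </head>
  <body>Add me as a search engine.</body>
</html>
```

```xml
<!-- opensearch.xml: the descriptor; the Url template is what the browser will query -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>My Custom Search</ShortName>
  <Description>Example custom search engine</Description>
  <Url type="text/html" method="get"
       template="https://www.newsearchengine.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

Both files can be served locally with something as simple as `python3 -m http.server 8000`; note that the `template` already points at an entirely different domain than localhost.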
Now, in my opinion there's a problem here: the OpenSearch specification does not require the browser to check that the `Url` -> `template` parameter actually matches the domain from which the files described in point (1) are being served. From localhost I can add a search engine pointing to https://www.newsearchengine.com/search?q={searchTerms}.
This opens up scenarios in which a website can point the user to a search page that serves malicious content in response to the query performed by the user's browser.
Example: www.innocent_site.com has been compromised and the main page is being served with the following injected snippet:
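An injected snippet of this kind would look roughly like the following (the exact markup and path are illustrative):

```html
<link rel="search"
      type="application/opensearchdescription+xml"
      href="https://www.innocent-site.com/search.xml"
      title="Innocent Site Search">
```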
Please note the domain in the href attribute is slightly different (a dash instead of an underscore). That XML file can then point the user anywhere, enabling malicious content downloads, ransomware, or redirection to a phishing site.
Users really are not aware of what is happening, because nothing in the URL bar reveals the website actually being visited.
I've prepared a simple (and innocuous) proof of concept at this url: the XML uses the search engine of the Medium platform, as advertised by their own OpenSearch descriptor XML.
There should be a way to force a match check of at least the three URLs involved:

- the domain that serves the OpenSearch description XML
- the domain in the href attribute of the OpenSearch description link tag
- the domain pointed to by the `Url` -> `template` attribute in the XML file
If any of these three domains does not match the others, the browser should raise a warning.
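As a sketch of the proposed check, here is a naive hostname comparison in Python (the function and URLs are illustrative; a real implementation would compare registrable domains via the Public Suffix List rather than raw hostnames):

```python
from urllib.parse import urlparse

def domains_match(*urls):
    """Return True if all URLs share the same hostname.

    Naive sketch: a production check should compare registrable
    domains (eTLD+1, via the Public Suffix List), not raw hostnames.
    """
    hosts = {urlparse(u).hostname for u in urls}
    return len(hosts) == 1

# The three URLs involved in the scenario described above:
serving_page   = "https://www.innocent_site.com/"                          # page embedding the link tag
descriptor_url = "https://www.innocent-site.com/search.xml"                # href of the rel="search" link
template_url   = "https://www.newsearchengine.com/search?q={searchTerms}"  # Url -> template

if not domains_match(serving_page, descriptor_url, template_url):
    print("Warning: search descriptor crosses domains")  # where the browser should warn
```

In this example all three hostnames differ, so the check fails and the warning fires.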
Mitigating factors:

- the user must be persuaded to add the website to their own custom search engine list
- this would only affect Firefox users
- however, the OpenSearch XML can be used for other purposes; the custom search feature is just a tiny part of it
Opinions?