Community Discussions and Support
URL/Hyperlink Blacklist

As the problem has been resolved by a reconfiguration, I propose not to do anything in Bearhtml to support suffixed DNS strings for now. If the problem becomes more widespread, I will implement a user option to support the suffix.

Martin


Recently, Pegasus started issuing the "URL Blacklist Alert!" for some very improbable websites. Among these, for example, is the United Parcel Service website (starting with http://wwwapps.ups.com). Paying close attention, I see that a message "Checking Blacklist" flashes on the status bar, then the message "Blacklisted: ups.com".

I know that I can turn this warning off by unchecking the 'Check for suspicious "phishing" URLS...' option, but it seems like a useful tool if I can configure it to stop warning me about sites that absolutely are NOT "known to promote SPAM sales".

I have searched high and low, and I can find no way to remove websites that I *know* aren't worthy of blacklisting. How did these websites get blacklisted in the first place?  Where is the blacklist?  How do I correct a known error?

Thank you.


To edit the blacklist you can press F6 to open a window containing the black- and white-lists.

There you can edit the entries of the lists.

I hope that helps you.

Thanks for the reply, but I don't think that's it.  On my system, the blacklist and whitelist (BLACK.PML and WHITE.PML, respectively) are both totally empty, and I think that mechanism provides a list of domains from which emails will be rejected or perhaps marked as spam. 

What I'm trying to find is the blacklist against which URLs in emails are compared when you click on them.  I get tracking numbers from UPS via email, and Pegasus warns me that the "ups.com" domain that I'm trying to surf to is "known to promote SPAM sales".  Pegasus doesn't prevent me from going there, it just presents an "OK/Cancel" dialog every time I click on a tracking URL from UPS.

And for the record, it isn't just UPS; there are other such sites that I would never have classified as spam sites, that Pegasus warns me against. Nor does Pegasus offer me an option to exempt the site.


IIRC, these are tested against http://www.surbl.org/ and the removal instructions are listed at that site. I'm still running v2.7.2 of BearHTML, so that might not be correct.

UPS.COM is not currently blacklisted.

Martin

I forgot to mention in my previous posting that Bearhtml.log will contain an entry that itemises the URL-checking service that reports the URL as blacklisted.

Use Bearhtml.hlp to look up the last number in the IP address returned by the blacklist service.

Martin

Shame on me for not having mentioned this up front, but I'm using Pegasus v4.41.

Martin - I have never followed the bearhtml installation instructions, and there is no bearhtml.log file anywhere on my system. Is bearhtml even running?  Would there be any explanations for what I'm seeing in this case?

1. I just now installed per the instructions, set WantBl=No (it was Yes?!) in Bearhtml.ini, and restarted.  This keeps Pegasus from prompting me about ups.com when I track my package, but I think I've then thrown the baby out with the bathwater if I've shut off all blacklist checking.  Here is the log:

Bearhtml version 2.7.6
Remote cache empty
User directory: \\FOGHORN\Shared\MAILBO~1\jgo
Pegasus Mail User: jgo
Pegasus Mail directory: C:\PROGRA~1\Pegasus
AutoConfig Url: [none]
Proxy Server: Inactive
Bearhtml Registry processing completed
Tidy error log:
Tidy configFile: C:\PROGRA~1\Pegasus\beartidy.cfg
Blacklist checking disabled
IE cache-ing disabled
Cache-ing enabled
UTF8 charset checking enabled
Bearhtml Ini processing completed
Loaded BearWarn.txt
UT: Input charset: iso-8859-1 changed to: CP1252
VC Tidy scorecard: 0 errors & 34 warnings

2. I set WantBl=Yes in Bearhtml.ini, and the behavior returns (ups.com *is* apparently blacklisted for me).  Here is the log:

Bearhtml version 2.7.6
Remote cache empty
User directory: \\FOGHORN\Shared\MAILBO~1\jgo
Pegasus Mail User: jgo
Pegasus Mail directory: C:\PROGRA~1\Pegasus
AutoConfig Url: [none]
Proxy Server: Inactive
Bearhtml Registry processing completed
Tidy error log:
Tidy configFile: C:\PROGRA~1\Pegasus\beartidy.cfg
Blacklist lookup requested
IE cache-ing disabled
Cache-ing enabled
UTF8 charset checking enabled
Bearhtml Ini processing completed
Loaded BearWarn.txt
UT: Input charset: iso-8859-1 changed to: CP1252
VC Tidy scorecard: 0 errors & 34 warnings
Blacklisted: ups.com

I see that blacklist lookup is requested, and that ups.com is blacklisted, but I see no reference to any Url checking service. 

Also, could you rephrase your suggestion about using bearhtml.hlp?  I'm definitely not following that train of thought very well.

What could cause a false positive, if I'm not using spamassassin?  Can you give me a URL to hit that you know *is* on the blacklist?  What would you try next, in my position?


Firstly, set up a Bearhtml.log, using Notepad, in your newmail directory (eg: \\FOGHORN\Shared\MAILBO~1\jgo).

Then retry your UPS message. Now check Bearhtml.log. There should be a message like:

"Blacklisted: ups.com as 127.0.0.n", where n indicates which service number is reporting the problem. See Bearhlp-en.htm section "Suspicious Html", where it lists the possible services that report on suspicious URLs.

Secondly, please send me a copy of the message so I can see what you are seeing.

Martin


New bit of info: It appears that *every* URL triggers this. It doesn't matter what the domain is.

[quote user="irelam"]Firstly setup a Bearhtml.log, using Notepad, in your newmail directory (eg: \\FOGHORN\Shared\MAILBO~1\jgo) [/quote]

?? I thought it was pretty apparent that I'd already done that.  I posted logs, and they represent exactly what I'm seeing.

[quote user="irelam"]Then retry your UPS message. Now check Bearhtml.log.  There should be a message like:

"Blacklisted: ups.com as 127.0.0.n "  where n indicates which service number is reporting the problem.   See Bearhlp-en.htm section "Suspicious Html" where it lists the possible services that report on suspicious Urls. [/quote]

There isn't.  There is a message like the one I posted previously:

Blacklisted: ups.com

It ends there. It doesn't say "as" anything, or give me an IP address.  I think that's why your directions for using the help file were confusing for me at first.  I understand you now, though.

[quote user="irelam"] Secondly please send me a copy of the message so I can see what you are seeing. [/quote]

I'd have to reconfigure my Firefox to post images, so instead I'll just transcribe the entire content of the dialog I see.  It is an untitled dialog with OK and Cancel buttons, containing:

URL Blacklist Alert!

URL domain name is known to promote SPAM sales

http://wwwapps.ups.com/WebTracking/processRequest?HTMLVersion=5.0&Requester=NES&AgreeToTermsAndConditions=yes&loc=en_US&tracknum=1Z[omitted]

Are you sure this Web Page is safe?

Click the "OK" button to open this web page.

Thank you sincerely, for the time you're spending on this.

Grady


New bit of info: It appears that *every* URL triggers this. It doesn't matter what the domain is.

This makes me think that you are not getting the right answer from http://www.surbl.org/ since according to the BearHTML help it is querying this location for the data.  Here are some reasons for the false positives.  You are probably not using SpamAssassin but you might be using something that modifies how the DNS queries work.

DNS bugs and incompatibilities leading to false positives

There is a bug (#3997) in versions of SpamAssassin older than 3.1 where the responses to DNS queries occasionally get mixed up, resulting in very rare false positives (wanted mail tagged as unsolicited). This can be seen when SpamAssassin shows a domain as blacklisted but it is not blacklisted when checking with a manual DNS query or on the lookup page. The solution is to upgrade to SpamAssassin version 3.1 or later.

Another issue for some anti-spam or anti-phishing DNS or proxy services that modify the results of DNS queries is that some of those changes may not be compatible with SURBL applications. In particular, modification of NXDOMAIN responses can result in false positives due to the changed address bits in the response. But any modification of the DNS query results can lead to application errors. The solution is to not use DNS or proxy services that modify query results on your systems running SURBL applications.

Additionally, some ISPs such as Verizon and others are now modifying some DNS NXDOMAIN responses in a way that causes what look like false positives on domains that are not blacklisted. They appear to be doing this to drive search traffic to other sites, but unfortunately it breaks DNS responses for SURBLs and other blacklists. Please check with your ISP if you are seeing DNS responses modified in this way. Verizon has an opt-out procedure with instructions on switching to DNS servers that do not change NXDOMAIN responses. Others such as Charter have opt-out nameservers that reportedly do not support NXDOMAIN. If so, then none of their nameservers may be compatible. One solution is to not use their nameservers.
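[Editor's sketch] All the failure modes above share one signature: SURBL itself only ever answers from 127.0.0.0/8, and NXDOMAIN means "not listed", so any other address indicates the response was rewritten somewhere in the DNS path. A minimal sketch of that interpretation (the function name and return strings are my own, for illustration):

```python
def classify_surbl_answer(answer):
    """Interpret the DNS answer for a <domain>.multi.surbl.org query.

    SURBL only returns addresses in 127.0.0.0/8. NXDOMAIN (represented
    here as None) means the domain is not listed. Any other address
    means something in the DNS path (an ISP wildcard, a suffix search
    list, a rewriting proxy) altered the response, which is the
    false-positive mode described above.
    """
    if answer is None:
        return "not listed"
    if answer.startswith("127.0.0."):
        return "listed"
    return "tampered DNS response"
```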


[quote user="Thomas R. Stephenson"]

New bit of info: It appears that *every* URL triggers this.  It doesn't matter what the domain.

This makes me think that you are not getting the right answer from http://www.surbl.org/ since according to the BearHTML help it is querying this location for the data.  Here are some reasons for the false positives.  You are probably not using SpamAssassin but you might be using something that modifies how the DNS queries work. [snip][/quote]

You may be right about that.  I use Treewalk, but I was under the impression that it simply cached DNS for me - it is based on BIND.

I also use a Bayesian proxy called K9 to filter out my spam, and *it* checks senders against an RBL, but it isn't in the DNS chain; it proxies the POP3 transaction.

At this point, I'm of a mind to just turn the feature off.  I'd be happy to help the bearhtml author hunt this down if he was so inclined, but I'm not going to insist on it.

Thank you both for playing!


At this point, I'm of a mind to just turn the feature off. I'd be happy to help the bearhtml author hunt this down if he was so inclined, but I'm not going to insist on it.

There is nothing that can be done by BearHTML to fix this if the DNS is returning the wrong answer to the query. According to the people running the http://www.surbl.org/ list, if what is returned by the list is a positive address then it's going to be a false positive. The only option you have is to turn it off, it seems.

[quote user="Thomas R. Stephenson"]There is nothing that can be done by BearHTML to fix this if the DNS is returning the wrong answer to the query. According the people running the http://www.surbl.org/ list if what is returned by the list is a positive address then it's going to be a false positive.  The only option you have is to turn it off it seems.[/quote]

It would seem so.  Still, I'd be curious to know what result bearhtml gets when it quizzes surbl that causes it to A) call any URL I click on blacklisted, and B) *not* display the 127.0.0.* address the author expected.  Do you know how to trace the conversation between bearhtml and surbl?

As I said, I am at the author's disposal if he would like my help to pursue it.


[quote user="JGradyO"]

Do you know how to trace the conversation between bearhtml and surbl?

[/quote]

Check the logs of your DNS server (Treewalk)

BearHTML will only be talking to your DNS server, which will then pass the query to surbl, receive the reply, do whatever processing it is configured to, then pass a (modified?) reply to BearHTML.

Edit:

One of Treewalk's listed features is:

- Reduce "not found", "404", and "DNS error" messages

This could be your problem, as a 'non-blacklisted' response is "not found".

[quote user="dilberts_left_nut"][quote user="JGradyO"]Do you know how to trace the conversation between bearhtml and surbl?[/quote]

Check the logs of your DNS server (Treewalk)[/quote]

Easier said than done, but hey; in for a penny, in for a pound.  Treewalk makes its "logs" available through the debugprint facility, so one needs a debugger.  I used DBGview from the freeware site formerly known as SysInternals.

[quote user="dilberts_left_nut"]BearHTML will only be talking to your DNS server, which will then pass the query to surbl, receive the reply, do whatever processing it is configured to, then pass a (modified?) reply to BearHTML.[/quote]

Yes and no.  The Treewalk logs reveal that BearHTML asks DNS to resolve a fictional site consisting of the root domain of the URL + "multi.surbl.org".  If everything goes well, that request gets routed to surbl for processing, and it will determine that "ups.com.multi.surbl.org" doesn't exist.  BearHTML determines that non-existent == not spammy/phishy/otherwise malignant.  I presume that surbl returns 127.0.0.x with the last octet set to something meaningful, for domains that are blacklisted.
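[Editor's sketch] The lookup described above can be sketched in a few lines; the function names are mine for illustration (the real BearHTML presumably goes through the Windows resolver), and `check_surbl` hits the network, so only run it where that is acceptable.

```python
import socket

def surbl_query_name(domain):
    """Build the name BearHTML resolves: <root domain> + '.multi.surbl.org'."""
    return domain.rstrip(".") + ".multi.surbl.org"

def check_surbl(domain):
    """Return the 127.0.0.x answer if *domain* is SURBL-listed, else None.

    A failed lookup (NXDOMAIN, raised as socket.gaierror) means the test
    name does not exist, which SURBL clients interpret as 'not blacklisted'.
    """
    try:
        return socket.gethostbyname(surbl_query_name(domain))
    except socket.gaierror:
        return None
```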

Turns out I had assigned a domain name (mshome.net), assigning a value to the "Connection-specific DNS Suffix" via my wifi router's DHCP, and to the "Primary DNS Suffix" in my TCP/IP stack. For some reason, DNS queries for <anything>.mshome.net ended up resolving positively to the IP address the ISP assigned to me. If BearHTML had appended a period to the end of its concatenation (i.e., ups.com.multi.surbl.org.), DNS would not allow the stack to go through the "DNS Suffix Search List" and append each entry to the domain being sought.

So:

ups.com.multi.surbl.org turned into ups.com.multi.surbl.org.mshome.net, and resolved to a real IP address.

ups.com.multi.surbl.org. would stay as ups.com.multi.surbl.org., and resolve to "not found".

BearHTML is apparently able to interpret only 127.0.0.x IP address results, and so was leaving the log in an uninformative state after getting a positive result.

I've got to reboot to remove all vestiges of "mshome.net" from my config, but I'm pretty sure this will clean up this situation in *my* case.  BearHTML would *probably* be fixed if it were to append a period to the end of the domain it did a lookup on, in order to prevent the MS stack from doing the "DNS Suffix" concatenations.  It would be safe to do this, since we know that ".multi.surbl.org" should always be the anchor of the domain.
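[Editor's sketch] The suffix behavior described above can be modeled with a simplified picture of how a stub resolver expands a query name (real resolvers also apply ndots-style rules and ordering that this ignores; the function name is mine):

```python
def resolver_candidates(name, search_list):
    """Simplified model of stub-resolver DNS-suffix handling.

    A trailing dot marks the name as fully qualified: it is tried
    exactly as written and the suffix search list is never consulted.
    Without it, the resolver may also try the name with each
    search-list suffix appended.
    """
    if name.endswith("."):
        return [name]
    return [name + "."] + [name + "." + suffix + "." for suffix in search_list]

# With a search list of ["mshome.net"], the unqualified query also
# produces "ups.com.multi.surbl.org.mshome.net.", which is the name
# that resolved to a real address and triggered the false positives.
```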

Right?

Grady


Thank you for your research.  For your interest, the Bearhtml help lists the various values of "n" in 127.0.0.n  pointing to the service that qualifies the DNS lookup as normal or suspicious.

I haven't a clue as to what your "connection specific DNS suffix" is about, but I am willing to put in a period at the end of the DNS string, as a user option if it helps. 

Martin


Your DNS should look up exactly what you tell it, NOT what it thinks you really meant. In my opinion this is broken behavior, especially for a server situation.


[quote user="dilberts_left_nut"]Your DNS should look up exactly what you tell it, NOT what it thinks you really meant. In my opinion this is broken behavior, especially for a server situation.[/quote]

Especially if you are querying any sort of black/white list server.

[quote user="dilberts_left_nut"]Your DNS should look up exactly what you tell it, NOT what it thinks you really meant. In my opinion this is broken behavior, especially for a server situation.
[/quote]

Everyone is entitled to an opinion.

By default, a DNS resolver will append its own domain to a query, to see if it gets a hit. This is part of the RFC for DNS unless I'm mistaken, and certainly the behavior of every DNS I've managed. The official syntax for telling a DNS resolver NOT to do that is to indicate you are being specific by punctuating your query with a trailing period. Otherwise, a DNS server in your own domain would not be able to answer positively when you ping (or nslookup, or telnet, or whatever) to "machinename" rather than "machinename.mydomain.com".

Treewalk *can* be a server, or it can just be a DNS cache; it really depends on how you configure it.  It behaves exactly like a Domain Name Server should behave, but I misconfigured A) my own machine, and B) my DHCP server (WiFi router).  I have not only reconfigured it so that I no longer cause BearHTML to malfunction, but also I have suggested a way for BearHTML to avoid the malfunction in the future, regardless of the Domain Suffix Search list and/or DNS behavior.
