{"id":127,"date":"2015-08-20T16:01:17","date_gmt":"2015-08-20T16:01:17","guid":{"rendered":"http:\/\/zewwy.ca\/?p=127"},"modified":"2018-01-13T18:27:06","modified_gmt":"2018-01-13T18:27:06","slug":"schannel-fatal-alert-70-on-exchange-server-2010","status":"publish","type":"post","link":"https:\/\/zewwy.ca\/index.php\/2015\/08\/20\/schannel-fatal-alert-70-on-exchange-server-2010\/","title":{"rendered":"Schannel Fatal Alert (70) on Exchange Server 2010"},"content":{"rendered":"<p>So I use Zenoss for centralized system monitoring, covering everything from network devices and ESXi hosts all the way to end servers such as Windows VMs using WMI.<\/p>\n<p>As I receive a flood of events from SharePoint and <del>its child service<\/del> a terrible workflow server add-on called K2 Blackpearl, I ignored my Zenoss server quite a bit. I did clean up my other servers pretty well, though. So when I noticed this alert on my Exchange server, I wasn&#8217;t too happy. I like clean event logs on most of the servers I manage. (I&#8217;ve made an exception for SharePoint and K2, since there&#8217;s a whole mixed bag of service accounts, permissions, and web parts&#8230; so many moving parts that I simply don&#8217;t care about their events, given there are no issues.)<\/p>\n<p>So I set out to figure out what was causing this event&#8230; the usual googling came up with the usual TechNet articles from those claiming it&#8217;s probably just re-associating to another acceptable protocol, and that I should accept it. And, as per usual whenever people can&#8217;t figure out why an event is triggering but it doesn&#8217;t seem to affect production: &#8220;you can just ignore it, or disable SChannel events.&#8221; This is not good enough for me, as it clearly indicates an issue going on in the back end.<\/p>\n<p>Digging further I came across <a href=\"http:\/\/blogs.msdn.com\/b\/kaushal\/archive\/2012\/10\/06\/ssl-tls-alert-protocol-amp-the-alert-codes.aspx\">this<\/a> tidbit of info. 
Using this info I knew it was an SSL protocol version issue, and since it was on my Exchange server I had a sneaking suspicion it was ActiveSync related.<\/p>\n<p>After installing Wireshark on the server and running it with the SSL filter in place, sure enough I was able to pinpoint the device triggering the events: my boss&#8217;s Note 4 running Android 5.01, using the native mail app. At first I simply went into his Exchange settings (noting that it would work externally but not internally) and unchecked SSL (which caused auth to fail, as expected), then re-enabled SSL. At first this seemed to make his ActiveSync work and I figured the events would go away; they did not, and checking Wireshark showed they were still coming from his phone.<\/p>\n<p>To paraphrase the solution:<\/p>\n<pre>1) Remove the corporate email account from the device. (Completely)\r\n2) Re-add the account to the device.\r\n<\/pre>\n<p>So that&#8217;s it! Since doing that I haven&#8217;t received any other SChannel fatal alert (70) events. I hope this helps others that come across the same events in their Exchange environment. Just note this was on Exchange 2010 SP3 RU 10.<\/p>\n<p><em><strong>Jan 2018 Update<\/strong><\/em><\/p>\n<p>Got to love event logging. You see so much, but sometimes so much can drown you. You just have to take care of the ones you can, when you can.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>So I use Zenoss for centralized system monitoring, covering everything from network devices and ESXi hosts all the way to end servers such as Windows VMs using WMI. 
As I receive a flood of events from SharePoint and its child service, a terrible workflow server add-on called K2 Blackpearl, I ignored my Zenoss server quite &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/zewwy.ca\/index.php\/2015\/08\/20\/schannel-fatal-alert-70-on-exchange-server-2010\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Schannel Fatal Alert (70) on Exchange Server 2010&#8221;<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"sfsi_plus_gutenberg_text_before_share":"","sfsi_plus_gutenberg_show_text_before_share":"","sfsi_plus_gutenberg_icon_type":"","sfsi_plus_gutenberg_icon_alignemt":"","sfsi_plus_gutenburg_max_per_row":"","footnotes":""},"categories":[8],"tags":[],"class_list":["post-127","post","type-post","status-publish","format-standard","hentry","category-server-administration"],"_links":{"self":[{"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/posts\/127","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/comments?post=127"}],"version-history":[{"count":1,"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/posts\/127\/revisions"}],"predecessor-version":[{"id":128,"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/posts\/127\/revisions\/128"}],"wp:attachment":[{"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/media?parent=127"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/zewwy.ca\/index.php\/wp-json\/wp\/v2\/categories?post=127"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/zewwy.ca\/index
.php\/wp-json\/wp\/v2\/tags?post=127"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}