    WShrboyce

    @wshrboyce

    Viewing 15 replies - 1 through 15 (of 23 total)
    • in reply to: Move over, Windows; Google and Apple are movin’ in #1459798

      I have a couple of thoughts on your analysis and presentation.

      First, the sales figures you cite for apps at Apple and Google lead the reader to believe that Microsoft has no sales at all, which ignores the Office 365 revenue that completely dwarfs the “stupid cat picture” apps sold to Android, Apple (and Windows) phone users to create those numbers. Not many of those apps are sustainable income sources. Good apps have a hard time making it on any of those platforms because the market seems to favor “short-term consumable” applications. Look at your own phone: how many of the apps you have loaded were purchased, and how many do you still use more than once a week after 30 days? O365/OneDrive/etc., on the other hand, will have customers returning based on a value proposition. It will be interesting to see whether that becomes a “churn” business the way cable TV is now.

      I was surprised you passed on the CURIOUS choice of words in the Nadella quote:
      ” I’m going to say… any thinking consumer should consider Microsoft because guess what, you’re not just a consumer.” In my opinion this is troublesome, mostly because thinking consumers are a really small part of the devices market and Microsoft may already have a large share of them. Most people do not buy phones, or any consumer computing device, after a deep, thoughtful purchase decision. More often than not the decision is based on emotional appeal and marketing ‘cool’. Those kinds of thinking decisions rarely seem to happen outside the enterprise.

      Finally, from where I sit it seems like Windows may have to move over some, but more importantly I think it will also move forward. I don’t believe the Windows market has shrunk as much as is commonly assumed, and now MS is moving the Windows brand into new markets where it does not have a dominant position. They never really achieved the vision of a PC in every home, and now they are adding a device in every hand buying Microsoft services. That is the point: Windows is no longer the Windows of version 3.11 we cut our teeth on. It too is more of a brand than a single OS/user interface these days. I find myself wondering if there will be a product/solution in the future, like Windows was then, that will bring sanity to the chaos of computing on the web. You remember: we had a different printer driver for every app, a different set of CTRL keys to learn for every app, nothing worked together well, and data transfers were difficult (Lotus to dBase and back, or Word to DisplayWrite). Fast forward 30 years and the problems have all been re-invented. It’s like the days of DOS, OS/2, UNIX, WordPerfect, dBase, and Lotus, only now it is the WWW and not SNEAKER NET. Microsoft moved into that PC world with Windows and became the dominant player; is Nadella trying to do the same on a new platform called the internet?

    • in reply to: Going Google (apps), Part 1: Move your mail #1388923

      Woody
      I can’t imagine why anyone would even consider “Going Google” given their record on privacy in general and their apparent lack of concern for the privacy rights of individuals and the contents of their email. In my opinion they abandoned their “do no evil” motto a long time ago and went over to the dark side. Their transgressions in this regard actually caused me to move in the opposite direction of your recommendations: I had a Google domain and used Gmail for email for a number of years, and finally moved to Microsoft Live domains based primarily on privacy concerns, even though I felt I lost a number of management features. Google’s deliberate missteps in this regard eventually led to their privacy czar leaving, or being forced to step down, but they have said nothing about changing the attitudes or policies that led to that action. It is almost as if she was a scapegoat.

      The only reason I still maintain a Gmail/Google account is that I own a tablet which runs Android. That experience has convinced me I will never own another Android device or phone, given how easy they make it for me to unwittingly give up my personal information and contacts so advertisements can be sent my way to pay for the “FREE” apps I use. The sad thing is, if I pay for the apps I often must make the same privacy concessions anyway; I just don’t have to suffer the ads. Microsoft’s Scroogled campaign has hit a nerve because it defines what I experienced and how I feel. For me Gmail has become my junk mail account, where email marketing goes to die, and with recommendations like these I may soon be sending Windows Secrets there as well.

      Nothing is ever free; be careful what you give up to get something you think costs you nothing. As purveyors of a newsletter, the Windows Secrets folks should understand that better than most. When my tablet finally dies, so will the last of my connections with Google. It remains to be seen whether Windows Secrets goes with it, and I don’t say that lightly, as I have “followed” you since the days of WOPR.

      With Apologies to the late Robert Palmer, for me, Google is simply resistible…

      HR

    • in reply to: Should your personal computer be quarantined? #1213110

      I find it hard to believe I am saying this, but I think I agree with Microsoft’s position on this, as long as it is properly managed and not abused by the regulating authority.

      I have been a broadband user for a long time and used to monitor the hits on my firewall router. I could not believe the amount of virus traffic my router stopped. I used to report some of the constant offenders to my provider at the time (Comcast) and ask why they permitted it. I also wondered why they did not stop most of it (what they could) at their routers. I found it rather ironic some time later when they sent me a letter stating they were going to start charging customers based on how much bandwidth they used. I replied saying that the first bill I got for extended use would be the last one I paid, and that if they cleaned up their network they could probably provide higher speeds and more bandwidth for all. At that point I decided to get another provider. Comcast now offers free (ahem – included in the cost) anti-virus, something Verizon still charges extra for, and as far as I know they have not implemented surcharges for heavy users. I think all of this is coincidence; I am not claiming to have effected a change at Comcast. I think good business sense just prevailed.

      Implemented correctly, blocking could easily be treated like any other utility: you get a warning, then a notice, then you are offlined or DMZ’d until you are cleared.

      The ISP is a private network, not a public trust; I get to use it because I am willing to pay and they are willing to sell. From a business perspective I believe they would be within their rights to block malicious traffic. If I don’t like that, I can take my business elsewhere. If/when wireless access becomes ubiquitous and companies like AT&T struggle to meet the demand of their customers, we should expect, maybe even require, them to block those users who are being abused by botnets or who allow their systems to be infected with malware.

      hr

    • in reply to: Macromedia MX – Pros vs. Cons #671939

      I have never used CF to access DB2, but it sounds like you have at least one web site that does that already. The MX user interface will be a learning curve to overcome compared to the old CF editor (or worse, Notepad). I think you’ll find the syntax checking to be excellent, and the Macromedia version of IntelliSense-style completion works pretty well. If you are truly going to use Flash for data collection, retrieval, and updates, and not just eye candy, then the learning curve goes through the roof unless you have experience with C# or Java and XML. ActionScript is very similar to Java, but the Flash interface presents some additional challenges in how to organize the user interface assets and the code behind the movies which makes it all happen.

      I would highly recommend visiting http://www.macromedia.com/devnet/ as well as investing in FIREFLY and the data connection kit for $299. If your time is worth just $10/hr, you could easily save more than 30 hours of trying to figure out how to put all the pieces together by working with the supplied samples and the working components in the kit. Macromedia used to give some of this stuff away for free; now they have worked it into a supported product and sell it as an add-on. If I had a paying customer to cover the cost I would be able to tell you more about it. $299 is more ‘play’ money than I can invest right now.

      You also might want to look around and see if there are other CF specific examples, they aren’t as good as MS about giving away code examples but they are getting better.

      Just my 2 cents
      HR

    • in reply to: Publishing FP2000 Problems (FP 2000 (2000 SExt)) #671875

      Thanks for the great reply…

      By now you may have your suspicions that I am not the ‘webmaster’ for this site, and you would be right. I have been asked to help get things straightened out. I helped set up the web server almost 3 years ago, and it has been user maintained and managed ever since.

      Thanks for the heads-up on the publishing-direction issue. I have successfully published backward, then forward again, with small sites (fewer than 100 pages) where I was in full control and knew that I was the only editor/publisher, and I might have tried it here, but I have never attempted it with a site as large as this.

      I am currently analyzing the layout of the web site to see how subwebs could best be used. The directories are currently at least SIX layers deep in some places, and links in lower directories often refer back to items and images just off the root. I have seen posts regarding trouble with nested websites; hopefully they mean the same page/file being served up by more than one root (not subwebs). If the images live in the parent web, will the subweb treat them like an external link? If yes, then a lot of links could be broken until a fully qualified URL replaces the existing relative link. Any thoughts/experiences?

      Just to further complicate matters, this web site is about to move to new 2000 servers (in a couple of weeks). My current inclination is to publish from the existing PRODUCTION site to a NEWTEST site, make the NEWTEST site what it needs to be (clean up any unwanted files, make sure all links work, and implement any subwebs if they are to be used), and then publish from NEWTEST to the NEWPROD site. I have been reading about the FP 2002 / STS extensions (are they really different?) and how they have been re-architected to perform better and be more stable. The current servers have the FP2000 server extensions, and the new servers will probably have at least FP2002 if not STS. Any thoughts?

      One last question (for this post, anyway):

      IF the site is designed as:

       
      [ROOT 1]
        +-- DIR
        +-- DIR
        +-- DIR
        +-- [SUBWEB A]
              +-- DIR
              +-- DIR
              +-- DIR
              +-- [SUBWEB B]
                    +-- DIR
                    +-- DIR
                    +-- DIR
                    +-- DIR
      

      and ROOT 1 is published, does publishing stop at the SUBWEB A level? And if SUBWEB A is published, does it start there and stop at SUBWEB B?

      You’ve put a lot of effort into this thread and I really appreciate it. Hopefully it will become one of those you “bookmark” for future reference. I promise I will keep you and the Lounge updated on how this all goes, that is, if there is any interest on your part in hearing about it.

      Thanks once again
      HR

    • in reply to: Publishing FP2000 Problems (FP 2000 (2000 SExt)) #671703

      Thanks for your reply….
      In my scenario it would seem that deleting the _vti_cnf files would be a first step. I am concerned, though, that if they are deleted from both the production and the test sites, the two will be out of sync if any synchronization of the info contained in them occurs during publishing. Based on the other posts I would assume that these files could be used as the mechanism to determine which files to update (based on either date/time stamp or total content). Do you know if this is true?

      This leads to some other questions..
      1. What happens if content is placed on the web NOT using FP and the publishing extensions? (I ask this because I have some suspicions that shortcuts get taken due to the time required to publish.)

      2. If subwebs are created in the directory structure of the main web, is there contention over the _vti_cnf content?

      3. Can subwebs be used in DEV and TEST environments but not in production? (The current method of publishing is import from DEV to TEST, publish from TEST to PROD.) It would seem that PROD would have to mirror TEST, including the subwebs.

      4. In one of the posts you mentioned that security for subwebs could be different from the main web. Was that true of the FP2000 server extensions, or was that in reference to the FP2002 extensions? I believe it certainly would be easier to maintain in FP2002.

      5. Has MS ever posted or stated a best practice for the size of an FP-managed website?

      I have about 4 more but I’ll stop here for now

      Thanks in advance for the time and energy you put into this post and all the others. I did search beforehand but had difficulty; most threads are probably named as badly as this one 😉

      Thanks again
      HR

    • in reply to: ODBC #531447

      I found the following at the provided link:

      ______________________________________
      Dim oConn As ADODB.Connection
      Dim oCmd As ADODB.Command
      Dim oRS As ADODB.Recordset
      Dim iTotalConflictingRecords As Integer

      ' Create and open a new connection to the Pubs database
      Set oConn = New ADODB.Connection
      oConn.Open "Provider=sqloledb;" & _
          "Server=(local);" & _
          "Initial Catalog=pubs;" & _
          "User ID=sa;" & _
          "Password=;"

      ' Create a Command object and its Parameter object
      Set oCmd = New ADODB.Command
      oCmd.ActiveConnection = oConn
      oCmd.CommandType = adCmdStoredProc
      oCmd.CommandText = "authors_load"
      oCmd.Parameters.Append oCmd.CreateParameter("au_id", adChar, adParamInput, 11, "172-32-1176")

      ' Create and open an updateable Recordset (passing in the Command object)
      Set oRS = New ADODB.Recordset
      oRS.CursorLocation = adUseClient
      oRS.Open oCmd, , adOpenStatic, adLockBatchOptimistic

      ' If author record was found
      If Not oRS.EOF Then
          ' do something with the recordset
      End If

      __________________________________________

      But it appears to have some constants that are unexplained, and I don’t need or want an updateable recordset. A snapshot(?) with no record locking would be more appropriate for my use, since the resulting recordset will be placed in a temporary Access table.
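
      Reading the constant names, my best guess at a read-only, snapshot-style variant of the tail end of that code would be something like this (untested on my part; `adLockReadOnly` substituted for the batch-optimistic lock):

      ```vb
      ' Guess at a read-only variant: static client-side snapshot, no locking
      Set oRS = New ADODB.Recordset
      oRS.CursorLocation = adUseClient                 ' client-side cursor
      oRS.Open oCmd, , adOpenStatic, adLockReadOnly    ' snapshot, read-only
      ```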

      Any suggestions?

      It would also seem that the workstations using this type of SQL connectivity would have to have MDAC and a SQL Server provider installed on each one, and I hear that it’s very difficult to determine the installed version to ensure compatibility. Is this true as well?

    • in reply to: ODBC #531411

      Actually, I am using a little-used feature of ODBC called ODBCDirect, which lets me execute a stored procedure and return a recordset. Permissions can then be set on the stored proc for the ODBC user. Obviously I am not using SA; I said that the user login contained in the code would be severely limited, so it couldn’t be SA. I have a mixed user group of Access 97 SR2 and Access 2K SR1. I can code exclusively for Access 2K SR1 if it’s the only way, but I would rather not exclude those who haven’t been migrated yet. I have everything working currently, but the specified ODBC connection must be installed on each workstation.
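
      For reference, the ODBCDirect pattern I am using looks roughly like this (the DSN, login, and procedure names here are placeholders, not my real ones):

      ```vb
      ' ODBCDirect workspace: DAO talks straight to ODBC, bypassing Jet
      Dim wrk As DAO.Workspace
      Dim cnn As DAO.Connection
      Dim rst As DAO.Recordset

      Set wrk = DBEngine.CreateWorkspace("odbcWrk", "limited_user", "pw", dbUseODBC)
      Set cnn = wrk.OpenConnection("", dbDriverNoPrompt, True, _
                "ODBC;DSN=MyDSN;UID=limited_user;PWD=pw;DATABASE=pubs")

      ' Run the stored proc and bring back a read-only snapshot
      Set rst = cnn.OpenRecordset("EXEC usp_MyProc", dbOpenSnapshot)
      ```

      The catch, as noted above, is that MyDSN still has to exist on each workstation.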

      Can you explain the ADO approach or point to a reference?

    • in reply to: ODBC #531256

      Charlotte,
      Is there a way to include everything needed to create an ODBC connection in an Access module, so that the connection does not have to be configured on each workstation? I know a file DSN can be used, but those get tricky with hard drive paths, etc., as well. I am hoping something can be created like what works in an ASP page. I am not concerned about putting the user ID and password in the module, as that particular user will be severely limited by the SQL Server permissions anyway.
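
      For illustration, what I am hoping for is a connection built entirely in code, something like this DSN-less ADO sketch (server, database, and login names are made up):

      ```vb
      ' DSN-less connection: everything lives in the module,
      ' nothing has to be configured on the workstation
      Dim cn As ADODB.Connection
      Set cn = New ADODB.Connection
      cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
              "Initial Catalog=MyDatabase;" & _
              "User ID=limited_user;Password=secret;"   ' severely limited SQL login
      ```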

      Thanks in Advance
      hr

    • in reply to: How can I use the Outlook GAL in a Word Mail Merge #524228

      Thanks to all…
      MS Access is able to get to the info we need, which currently resides in the GAL, and it’s easy enough to program the mail merge they want. I will look at migrating that info into a new public folder, which will be easier for the user community to maintain. They say the list does not change that much (it’s a recent Exchange migration which we have inherited, so it’s tough to make such a determination), but time will tell.

      If anyone has any tips on the best way to accomplish such a migration, I would be interested in hearing from you; or if you recommend that a new thread be started on that topic, I will understand.

    • in reply to: How can I use the Outlook GAL in a Word Mail Merge #523567

      I guess PF can stand for more than Personal Folder (oops). I hadn’t thought about a public folder; I will have to consider that. Most of the 500 or so addresses are already in the GAL, which is why I was going after it.

      Why do you see it as an advantage to use a public folder instead of the GAL? Both are central repositories, but how would you choose to put a ‘contact’ in one and not the other, and why? Regarding how much it’s used, I would say fairly constant; most of the 10 employees are working contacts, sending info by email, fax, or snail mail, or soliciting donations from members, etc. Not enough to cause the server a problem with a public folder, but if it was a personal folder, forget it!

      HR

    • in reply to: How can I use the Outlook GAL in a Word Mail Merge #523552

      The problem with using a personal folder with contacts is that only one person in the office can have access to it at a time. Sharing PFs can lead to all sorts of disasters, especially as they grow in size; I have seen servers brought to their knees by shared PFs. If Access can link to the GAL, then it “should” be a relatively easy task to write a macro that empties a table, runs a query against the GAL, and appends the records to the local Access table. That table can then be used to do a mail merge in Word. ‘At the press of a button’ it should be a repeatable process, which is what the doctor (client) ordered. Being able to update the GAL through Access sounds like risky business at best, something that will have to be done in a lab for a while before I try it on site. Translated: it will probably never happen.
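
      A rough sketch of that empty-and-append macro, assuming the GAL has been linked as a table and using made-up table and field names:

      ```vb
      ' Refresh the local merge table from the linked GAL table
      ' (tblMergeList, tblGAL, and the field names are illustrative)
      Public Sub RefreshMergeList()
          Dim db As DAO.Database
          Set db = CurrentDb

          ' Empty the local table first...
          db.Execute "DELETE FROM tblMergeList", dbFailOnError

          ' ...then append the current GAL entries
          db.Execute "INSERT INTO tblMergeList (FullName, EmailAddress) " & _
                     "SELECT [Name], [E-mail Address] FROM tblGAL", dbFailOnError
      End Sub
      ```

      Wire that to a button and the mail merge in Word always sees fresh data.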

    • in reply to: How can I use the Outlook GAL in a Word Mail Merge #523416

      The Exchange server is used by a small non-profit with about 10 employees. But like most non-profits, they want the GAL to contain a membership list with email and snail mail addresses as well as other contact info. It might(?) be nice if Access is two-way and a form could be created to allow them to add contacts to the GAL, so that an administrator (me as a contractor) need not be required. Edit checks and all could be placed on the form, and the user would be kept out of the Exchange internals, where they could do much damage.

      Once I get onsite and get Access to link to the GAL I will post back. I didn’t have any luck on my corporate GAL, with many thousands of listings, but I wasn’t sure if it was a time-out issue or a permissions issue. I was able to create the link, but getting any data was an impossible wait.

      H

    • in reply to: How can I use the Outlook GAL in a Word Mail Merge #523273

      That appears to be the ticket! We’ll be testing it soon. BTW, do you know of any ‘permissions’ issues on the GAL? As you say, it’s a link and live; does that mean that names could be added this way???

      Thanks again
      HR

    • in reply to: Getting the NT login ID in an Access form #521946

      Rob
      Actually, you still might want to think about the front-end/back-end idea. Here is why: typically it’s best to put your user interface elements in one MDB file, keep the data in another, and use linked tables to get to the data. The real advantage is that if you have to update the application (forms, macros, queries, reports, etc.) you don’t have to worry about keeping data in sync. Also, if you distribute the file to the desktop machines it will load faster, and your server and network guys will thank you. I know it’s a pain to distribute that way, but hey, attach it to an email (the mail guys will hate that). Just remember to use a UNC path (\\servername\sharename) to identify the location of the MDB file holding the tables. DO NOT use a mapped drive unless you are absolutely sure everyone who needs to use the app will use the same drive letter.
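
      If it helps, here is a quick sketch of re-pointing every linked table at the UNC path in code (the path and file names are examples only):

      ```vb
      ' Re-point all linked tables at the back-end via UNC, never a mapped drive
      Public Sub RelinkToUNC()
          Dim tdf As DAO.TableDef
          For Each tdf In CurrentDb.TableDefs
              If Len(tdf.Connect) > 0 Then    ' only linked tables have a Connect string
                  tdf.Connect = ";DATABASE=\\servername\sharename\backend.mdb"
                  tdf.RefreshLink
              End If
          Next tdf
      End Sub
      ```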

      Glad the permissions thing solved your dilemma.
      virtually
      HR
