• WSDavidHLevin (@wsdavidhlevin)

    Viewing 15 replies - 31 through 45 (of 69 total)
    • in reply to: Query to analyze contents of memo fields #1557986

      Hi Mark,
      I was wondering whether your revised SQL accounts for a space between a semicolon and the colour following it.

      Dave

      (Edited to remove a bunch of stupid stuff I wrote.)

    • in reply to: Query to analyze contents of memo fields #1557911

      “This part of the application is actually to do with selecting personnel for various tasks requiring various skills.”

      A consequence of this is that if any skill is a substring of at least one other skill, as in “typing” and “light typing”, then the elegant (in my opinion) solution suggested by MarkLiquorman might sometimes fetch skills that aren’t germane.
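      Both this pitfall and the question about spaces after semicolons go away if the memo is split into whole entries instead of searched as a substring. A minimal Python sketch, assuming (as the thread suggests) that the memo field holds semicolon-separated entries with optional surrounding spaces; the function name has_skill is invented for illustration:

```python
# Sketch: exact-entry matching in a semicolon-delimited memo field, so that
# "typing" does not match inside "light typing". The memo format
# (semicolon-separated entries, optional spaces) is an assumption.

def has_skill(memo: str, skill: str) -> bool:
    """Return True only if `skill` appears as a whole entry in `memo`."""
    entries = [e.strip().lower() for e in memo.split(";")]
    return skill.strip().lower() in entries

memo = "light typing; filing;shorthand"
print(has_skill(memo, "typing"))        # False: not a whole entry
print(has_skill(memo, "light typing"))  # True
```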

      Dave

    • in reply to: Query to analyze contents of memo fields #1557486

      Hi Murgatroyd,
      Having to parse data that's fetched from the Item table could have been avoided if the Item table had the same columns as the desired result of your Step 1, that is, if it consisted of the fields Item, Memo, and Color. An item with two colors would then simply occupy two rows, one per color.

      Was there a reason for the Item table’s having its actual layout?
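      For concreteness, here is a small Python/SQLite sketch of that suggested layout (all item and color values are invented): each item/color combination gets its own row, so no string parsing is needed.

```python
# Sketch of the suggested Item(Item, Memo, Color) layout; values are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Item (Item TEXT, Memo TEXT, Color TEXT)")
con.executemany("INSERT INTO Item VALUES (?, ?, ?)", [
    ("Widget", "note A", "red"),
    ("Widget", "note A", "blue"),   # same item, second color: its own row
    ("Gadget", "note B", "green"),
])
# No parsing needed: the colors come straight back as rows.
colors = [c for (c,) in con.execute(
    "SELECT Color FROM Item WHERE Item = 'Widget' ORDER BY rowid")]
print(colors)  # ['red', 'blue']
```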

      Dave

    • in reply to: How to convert an EML file to a JPG file? #1556639

      “Is Snagit a proven safe website?”

      I don’t know how we can ever be sure that a website that is “safe” today will not be infected tomorrow. But the product (Snagit) and its manufacturer (TechSmith) have apparently been around for decades.

      You could do an internet search on “snagit user reviews” and factor the results into your assessment of safety.

      Dave

    • in reply to: How to convert an EML file to a JPG file? #1556506

      Hi PCL,
      Snagit is claimed to do what you described.

      https://www.techsmith.com/snagit.html

      Hope this helps,
      Dave

    • in reply to: Finding duplicates, and removing duplicates #1548584

      Hi jlwood44,
      I agree with Rui that “find duplicates” is apparently being used here in a way for which it wasn’t really intended.

      Also, the tables might have been designed to be more robust. Suppose a player had a given three-game FTM sequence occur more than once in a season, such as 12 in game 3, 9 in game 4, and 10 in game 5, and then again 12 in game 17, 9 in game 18, and 10 in game 19. This could be accommodated by adding a field, which I'll call g1_number, to the table that contains the FTM triplets, recording the game number of the first game in the triplet. The two otherwise-identical triplets could then coexist as separate rows, one with g1_number 3 and the other with g1_number 17.

      These entries wouldn't be "duplicates"; they would conform to the table layout. This representation would show where a player achieved a given three-game total more than once in a season, yet you could still obtain a single query row by including the appropriate criteria in the select statement, and the select statement could sort the results descending by year.
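      That kind of selection can be sketched as follows; the table and column names (ftm_triplets, player, year, g1_number, g1..g3) are assumptions for illustration, not the actual schema:

```python
# Sketch: one row per player/year, keeping the highest three-game FTM total.
# Table layout, column names, and data are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ftm_triplets "
            "(player TEXT, year INTEGER, g1_number INTEGER, "
            "g1 INTEGER, g2 INTEGER, g3 INTEGER)")
con.executemany("INSERT INTO ftm_triplets VALUES (?, ?, ?, ?, ?, ?)", [
    ("Smith", 1998, 3, 12, 9, 10),   # the same sequence occurs twice...
    ("Smith", 1998, 17, 12, 9, 10),  # ...distinguished by g1_number
    ("Smith", 1998, 7, 8, 7, 6),
])
best = con.execute(
    "SELECT player, year, MAX(g1 + g2 + g3) AS best_total "
    "FROM ftm_triplets GROUP BY player, year ORDER BY year DESC"
).fetchall()
print(best)  # [('Smith', 1998, 31)]
```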

      Perhaps if you posted the present tables and explained what the FTM data are used for, we would be able to offer more specific suggestions.

      Dave

    • in reply to: Finding duplicates, and removing duplicates #1548545

      Hi jlwood44,
      I’m not sure why you mention “index,” the purpose of which is to allow the database management system to more quickly fetch database rows that match the query criteria. A database would have to be quite large before the overhead of creating an index would be outweighed by the benefit.

      Also, perhaps you could more clearly explain what you consider a "duplicate," that is, what field(s) you want to serve, in effect, as the unique identifier. Would it be player and year, so that there would be only one result per player-and-year combination, namely, the one where free throws made over the three games was highest? If several rows for a given player and year had the same three-game sum, which one would you want selected?

      It might help also if you provided the query statement(s) you’re using.

      Hope this helps,
      Dave

    • in reply to: Shortest Distance #1541244

      The line segment from the midpoint of the diameter (the circle's center) to the midpoint of the chord forms one leg of a right triangle whose other leg goes from the midpoint of the chord to one end of the chord and whose hypotenuse is r centimeters. I believe that the length of the former leg would be sqrt(r^2 - (2r/3)^2), or (r/3) times the square root of 5, centimeters.
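      A quick numeric check of that computation in Python (picking r = 3 centimeters arbitrarily):

```python
# Numeric check: center-to-chord-midpoint leg vs. the closed form (r/3)*sqrt(5).
import math

r = 3.0                                # radius; arbitrary test value
half_chord = 2 * r / 3                 # leg from chord midpoint to chord end
leg = math.sqrt(r**2 - half_chord**2)  # leg from center to chord midpoint
print(leg, (r / 3) * math.sqrt(5))     # the two values agree
```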

    • in reply to: Preventing Duplicate Entries in Continuous subform #1541206

      That’s great. Would you like to share your solution here, in case someone else encounters the problem and looks in this thread for help?

    • in reply to: Preventing Duplicate Entries in Continuous subform #1541043

      Hi simsima,
      Welcome to the forum.

      Just to be sure the problem is clear, could you confirm which of the following scenarios (each starting from a “clean slate”) trigger the msgbox?

      A. item_no_IM 0 followed by item_no_IM 0
      B. item_no_IM 0 followed by item_no_IM 1
      C. item_no_IM 1 followed by item_no_IM 0
      D. item_no_IM 1 followed by item_no_IM 1

      Dave

    • Hi Lady-Laughsalot,
      If a document has been impeccably formatted (e.g., no bold phrase has a leading bold space or trailing bold space), a macro having the following logical structure might work.

      # Start
      Go to top of document
      Search down for bold character
      Repeat until Not Found
          Insert quotation mark just before character
          Search down for non-bold character
          Search up for bold character
          Insert quotation mark just after character
          Search down for bold character
      End Repeat
      # End
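      For anyone who would rather see the idea outside of Word, here is a plain-Python sketch of the same run-quoting logic, modeling the document as a list of (character, is_bold) pairs; this is an illustration only, not actual Word macro code:

```python
# Sketch: insert quotation marks around each maximal run of bold characters.
# The (character, is_bold) representation is an illustrative assumption.

def quote_bold_runs(chars):
    out = []
    in_bold = False
    for ch, bold in chars:
        if bold and not in_bold:
            out.append('"')   # opening quote just before the run
            in_bold = True
        elif not bold and in_bold:
            out.append('"')   # closing quote just after the run
            in_bold = False
        out.append(ch)
    if in_bold:               # a run that extends to the end of the document
        out.append('"')
    return "".join(out)

doc = [(c, False) for c in "see "] + [(c, True) for c in "me"] \
      + [(c, False) for c in " now"]
print(quote_bold_runs(doc))  # see "me" now
```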

      If two or more “consecutive” table cells might be bold, the above logic might fail to insert the “intermediate” quotation marks, in which case you might then run a macro that’s something like the following.

      # Start
      For all tables
          For all cells
              If the first character in the cell isn’t a quotation mark
                  Then insert a quotation mark before the character
              If the last character in the cell isn’t a quotation mark
                  Then insert a quotation mark after the character
          End for
      End for
      # End

      Hope this helps,
      Dave

    • in reply to: Pairing up to form perfect squares #1516705

      I wasn’t sure whether this fairly lengthy chain of reasoning is what you’re looking for. Anyway, in trying to determine the pairings, 18-7, 17-8, and 16-9 are forced (since even 18 + 17 = 35 falls short of 36, the only perfect-square sum available to 18, 17, and 16 is 25). (I soon realized that 15 through 10 could be paired with 1 through 6, respectively, but that doesn’t prove that there are no other solutions, hence the following.) This forces 2 to be paired with 14 (because 7 is taken), which in turn forces 11 to be paired with 5 (because 14 is taken), which in turn forces 4 to be paired with 12 (because 5 is taken), which in turn forces 13 to be paired with 3 (because 12 is taken), which in turn forces 6 to be paired with 10 (because 3 is taken), which leaves 1 and 15.
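      The uniqueness claim can also be confirmed by brute force. A short Python backtracking search, assuming (from the thread’s context) that the puzzle is to pair 1 through 18 so that each pair sums to a perfect square:

```python
# Brute-force check: pair 1..18 into nine pairs with perfect-square sums.
# Possible pair sums run from 3 to 35, so only 4, 9, 16, and 25 matter.

def pairings(nums, squares={4, 9, 16, 25}):
    if not nums:
        yield []
        return
    a, rest = nums[0], nums[1:]
    for i, b in enumerate(rest):
        if a + b in squares:
            for tail in pairings(rest[:i] + rest[i + 1:]):
                yield [(a, b)] + tail

solutions = list(pairings(list(range(1, 19))))
print(len(solutions))        # 1 -- the pairing is unique
print(sorted(solutions[0]))
```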

    • in reply to: Grouping fields in report header #1508846

      Hi Murgatroyd,
      From your description, it seems that NotesDateTime is dependent solely on OrderID. Therefore, even though NotesDateTime is “about” the notes rather than the order, I’d be inclined to put it in the Orders table rather than the Notes table, which not coincidentally would address the issue you’re having with where that information appears on the report.

      Dave

    • in reply to: Help with PHP #1507761

      Hi Slovey,
      I see that your original post contains

      TestDropDown.htm code

      but your php script fragment is preceded by “PHPTest.php code”.

      I wasn’t sure if these names were literally those being used in your test environment.

      Dave

    • in reply to: Solid state drives can lose data (True?) #1505764

      Posts on this forum aren’t subject to formal peer review with respect to their substantive content. If the lack of formal peer review were sufficient reason to discount even an informed opinion, then I’m not sure what value there would be in this forum.

      Dave
