Does this support require Make to be linked against the UCRT run-time library, or does it also work with the older MSVCRT?
I haven't found anything explicitly mentioned about this in the official doc:
Also, the manifest can be applied to the executable even after compilation,
using mt.exe (the standard MS workflow), so it shouldn't matter which run-time
library Make is linked against, because the manifest can be embedded even
after the link phase. Not sure if that's a convincing argument, though.
If Make is built with MSVC, does it have to be built with some new
enough version of Studio to have the necessary run-time support
for this feature, or any version will do?
I haven't built Make with MSVC at all (patch is focused on building with
GNU tools) but again there is no mention of this in the official doc above.
It is just another case of using a manifest file, where this time the manifest
is used to set the active code page of the process to UTF-8.
In fact, the manifest can be embedded into the target executable even
post-compilation, using mt.exe, so I don't think a recent version of VS
is a requirement to build properly.
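For reference, this is roughly what such a manifest looks like (a minimal sketch following the Microsoft documentation; the assembly name here is illustrative, not the one from the patch):

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly manifestVersion="1.0" xmlns="urn:schemas-microsoft-com:asm.v1">
  <assemblyIdentity type="win32" name="GNU.Make" version="1.0.0.0"/>
  <application>
    <windowsSettings>
      <!-- Sets the active code page of the process to UTF-8
           on Windows 1903 and later; ignored on older systems. -->
      <activeCodePage xmlns="http://schemas.microsoft.com/SxS/2019/manifest">UTF-8</activeCodePage>
    </windowsSettings>
  </application>
</assembly>
```

It can then be embedded into an already-linked executable with something like
'mt.exe -manifest make.manifest -outputresource:make.exe;#1'.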
Does using UTF-8 as the active page in Make mean that locale-dependent
C library functions will behave as expected?
I think so. Here is the relevant doc I found:
where the interesting bits are those where "operating system" is mentioned, like:
"Also, the run-time library might obtain and use the value of the operating system
code page, which is constant for the duration of the program's execution."
I believe with setting the active code page of the process to UTF-8 we
are effectively forcing the process to think that the operating system
code page is UTF-8, as far as that process is concerned.
Did you try running Make with this manifest on older Windows systems,
like Windows 8.1 or 7? It is important to make sure this manifest doesn't
preclude Make from running on those older systems, even though the
UTF-8 feature will then be unavailable.
I did not try as I don't have access to such systems, but it seems pretty
clear from the doc that this should not be a problem:
"You can declare this property and target/run on earlier Windows builds, but you must handle legacy code page detection and conversion as usual. With a minimum target version of Windows Version 1903, the process code page will always be UTF-8 so legacy code page detection and conversion can be avoided."
It sounds like it will simply not use UTF-8, meaning that any UTF-8 input
would still cause Make to break, but that would happen anyway with such
input. Based on the above, it shouldn't change existing behavior in these
older systems, and certainly not stop Make from running on them.
When Make invokes other programs (which it does quite a lot ;-),
and passes command-line arguments to it with non-ASCII characters,
what will happen to those non-ASCII characters?
I think your expectation is correct. Windows seems to be converting the UTF-8
encoded strings to the current ANSI codepage, therefore allowing non-ASCII
characters (that are part of that ANSI codepage) to be propagated to the
non-UTF-8 program.
Below are some experiments to show this.
In what follows, 'mingw32-make' is today's (unpatched) Make for Windows, as
found in a typical mingw build distribution. Since it is unpatched, it uses
the local ANSI codepage, which is windows-1252 on my machine.
'make' is the patched version which uses the UTF-8 codepage.
hello :
<TAB>gcc ©\src.c -o ©\src.exe
where the (extended ASCII) Copyright sign has been used (0xA9 in 1252).
Makefile 'utf8.mk' has the same content but is encoded in UTF-8, so the
Copyright sign is represented as 0xC2 0xA9 (a two-byte UTF-8 sequence,
confirmed with a hex editor).
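The two byte sequences are easy to double-check outside a hex editor; for example, in Python (unrelated to Make itself, just illustrating the encodings):

```python
# The copyright sign is a single byte (0xA9) in Windows codepage 1252,
# but a two-byte sequence (0xC2 0xA9) in UTF-8.
sign = "\u00a9"  # ©

print(sign.encode("cp1252").hex())  # a9
print(sign.encode("utf-8").hex())   # c2a9
```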
With the unpatched Make that uses the local codepage:
- the 1252-encoded makefile works fine and produces the .exe under the
  copyright folder (current behavior);
- 'utf8.mk' breaks because the unpatched Make can't understand the UTF-8
  file (expected).
With the patched Make that uses the UTF-8 codepage:
- the 1252-encoded makefile breaks because Make expects UTF-8 and we are
  feeding it a 1252 file;
- 'utf8.mk' works fine and produces the .exe under the copyright folder.
I believe this last case is the one that answers your question:
Make (now working in UTF-8) calls gcc (working in 1252) with some UTF-8
encoded arguments. gcc has no problem doing the compilation and
producing the executable under the Copyright folder, which suggests that
Windows did indeed convert the UTF-8 arguments into gcc's codepage (1252),
and because the Copyright sign does exist in 1252 the conversion was
successful, allowing gcc to run.
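The conversion Windows presumably performs here can be mimicked (this is only a model of the observed behavior, not how Windows is implemented): decode the UTF-8 argument and re-encode it in the callee's ANSI codepage.

```python
# A UTF-8 encoded argument containing the copyright sign, as the
# patched Make would pass it:
utf8_arg = "\u00a9\\src.c".encode("utf-8")   # b'\xc2\xa9\\src.c'

# Converted to the callee's ANSI codepage (1252 here), as Windows
# appears to do when the callee is not UTF-8 aware:
ansi_arg = utf8_arg.decode("utf-8").encode("cp1252")

print(ansi_arg)  # b'\xa9\\src.c' -- the sign survives as the single byte 0xA9
```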
So it doesn't look like this change is disabling non-ASCII argument support
in programs called by Make. They maintain whatever characters are available in
their local codepage (1252 in this case), and since UTF-8 covers the entire Unicode
spectrum, it doesn't seem like we are losing any currently working scenarios.
So this feature will only be complete when the programs invoked by Make are
also UTF-8 capable.
I agree. Make is the gateway to many programs and it can't control what they
do and how they work internally. But by working in UTF-8 itself, it is at least
giving those other programs the chance to work in UTF-8, so it's not blocking them.
In other words, if those programs could work in UTF-8 but Make itself could
not, they wouldn't even get the chance to run, because Make would break
before calling them.
Also, since the above experiments seem to suggest that we are not dropping
existing support for non-ASCII characters in programs called by Make, it seems
like a clear step forward in terms of Unicode support on Windows.
But you are right, if those programs themselves don't support UTF-8, they are
just going to error out when faced with full UTF-8 arguments (that don't map to
anything in their legacy encoding), but that will be an error on their side, not Make's.
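That failure mode can be illustrated the same way as the successful conversion above (again, just a model of the conversion, not of Windows internals): a character that exists in UTF-8 but has no mapping in the legacy codepage cannot be converted.

```python
# The Greek letter omega exists in Unicode (and so in UTF-8), but has
# no mapping in codepage 1252, so converting an argument containing it
# to the callee's legacy codepage must fail:
try:
    "\u03a9\\src.c".encode("cp1252")  # Ω\src.c
except UnicodeEncodeError as err:
    print("conversion failed:", err.reason)
```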
It is OK to alter the general Makefile.am files, but please note that
Make for Windows is canonically built using the build_w32.bat batch
file; building using the Unix configury stuff is an option not
currently directly supported by the project (although I believe it
does work).
I cross-compiled Make for Windows using gcc (mingw-w64) and the
autoconf + automake + configure + make approach, so it clearly worked
for me, but I didn't realize that this wasn't the standard way to build for
a Windows host.
Does this mean that all builds of Make found in the various distributions
of the GNU toolchain for Windows (like mingw32-make.exe in the examples
above) were necessarily built using build_w32.bat? I suppose not necessarily,
since those distributors could be doing something similar to what I did with
the Unix-like build approach. If so, they could benefit from the patch as-is;
if they build with build_w32.bat, they won't see a difference until the patch
gets applied there as well.
Since build_w32.bat is a Windows-specific batch file, does this rule out
cross-compilation as a canonical way to build Make for Windows?
Assuming all questions are answered first, would it be OK to work on the
build_w32.bat changes in a second separate patch, and keep the first one
focused only on the Unix-like build process?
Thanks,
Costas