
Is there a way to avoid uniform convergence in Urysohn's extension theorem?

In the remarkable book Rings of Continuous Functions by Gillman and Jerison, I came across the following theorem:

Urysohn's Extension Theorem (UT). A subspace $S$ of $X$ is $C^*$-embedded in $X$ if and only if any two completely separated sets in $S$ are completely separated in $X$.

The proof goes as usual, constructing a sequence of continuous functions on $X$ that converges uniformly to the desired extension. Then, after proving Urysohn's Lemma (for normal spaces), UT yields Tietze's Theorem (TT).
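For reference, the uniform-convergence step is the usual successive-approximation argument; sketching it in my own notation (so the constants below are the standard ones, not necessarily Gillman and Jerison's): given $f \in C^*(S)$ with $|f| \le 1$, set $f_1 = f$ and, at stage $n$ (where $|f_n| \le (2/3)^{n-1}$ on $S$), consider

$$A_n = \left\{ x \in S : f_n(x) \le -\tfrac{1}{3}\left(\tfrac{2}{3}\right)^{n-1} \right\}, \qquad B_n = \left\{ x \in S : f_n(x) \ge \tfrac{1}{3}\left(\tfrac{2}{3}\right)^{n-1} \right\}.$$

These sets are completely separated in $S$ (by $f_n$ itself), hence by hypothesis completely separated in $X$, so there is $g_n \in C(X)$ with $|g_n| \le \tfrac{1}{3}(\tfrac{2}{3})^{n-1}$ taking the value $-\tfrac{1}{3}(\tfrac{2}{3})^{n-1}$ on $A_n$ and $\tfrac{1}{3}(\tfrac{2}{3})^{n-1}$ on $B_n$. Checking cases gives $|f_n - g_n| \le (\tfrac{2}{3})^n$ on $S$, so one puts $f_{n+1} = f_n - g_n$ and iterates. Since $\sum_n \tfrac{1}{3}(\tfrac{2}{3})^{n-1} = 1$, the series $g = \sum_n g_n$ converges uniformly on $X$ by the Weierstrass $M$-test, and $g$ extends $f$. It is exactly this appeal to uniform convergence that I would like to avoid.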

On the other hand, there are some ways to prove TT without mentioning uniform convergence at all. For instance, as done by Scott here.

Since UT is a generalization of TT, I wonder if there is a way to prove UT without using uniform convergence. I tried to adapt Scott's argument to the general setting of UT, but I could not avoid the need for normality.



from Hot Weekly Questions - Mathematics Stack Exchange
