
Issues using ShadowDom and ShadowRoot elements #1784

Closed
mgol opened this issue Oct 21, 2014 · 33 comments

@mgol
Member

mgol commented Oct 21, 2014

Originally reported by rictic at: http://bugs.jquery.com/ticket/15201

JSBin link to reproduce: http://jsbin.com/tamugo/2/edit (try in Chrome, or in Firefox with dom.webcomponents.enabled turned on).

An element that is in the shadow DOM appears in some ways to not be contained in the document. .offset() assumes that if the element isn't contained in the document then it's disconnected, and returns an offset of {top: 0, left: 0}.

When .offset() checks to see if the element is disconnected it could try walking up the DOM not only by checking .parentNode as .contains() does but also by looking at .host, which is how one travels up from a shadow root into its containing element.
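
For illustration, a disconnected-check that walks up through shadow boundaries could look roughly like this (just a sketch of the idea, not jQuery code; isAttachedToDocument is a made-up helper name):

function isAttachedToDocument( elem ) {
  var node = elem;
  while ( node ) {
    if ( node === elem.ownerDocument.documentElement ) {
      return true;
    }
    // .parentNode walks within a tree; .host jumps from a shadow root
    // to the element hosting it
    node = node.parentNode || node.host;
  }
  return false;
}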

Issue reported for jQuery 2.1.1

@TeaSeaLancs

This is also present on jQuery 1.11.1.

The main issue is that the internal .contains() function, which is used by .offset() and in other places such as isHidden(), uses either the native element contains method or, failing that, compareDocumentPosition, to see whether the element in question is underneath element.ownerDocument. For elements that are underneath a shadow root, this is always false.

Unfortunately there is no built-in function or accessor which will give the shadow root for a given element, which means that traversal is currently the only guaranteed way of retrieving the host of a shadow root.

Putting this into the internal .contains() function would no doubt increase its running time massively. However, I think it would be acceptable (from my own limited knowledge) to use a guardian function in .offset(), isHidden, etc., which works on the principle of:

if ((browser has shadow DOM support) && (element matches CSS selector ":host *")) { traverse to find shadow host and use that for .contains() }

which would hopefully not add too much of an overhead. Thoughts?
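
As a rough sketch (the ":host *" test and feature check are the ones mentioned above; elementForContainsCheck is a made-up name, and real feature detection may need to differ):

function elementForContainsCheck( elem ) {
  // Only do extra work when the browser has shadow DOM support and the
  // element actually sits under a shadow root
  if ( elem.createShadowRoot && elem.matches && elem.matches( ":host *" ) ) {
    var node = elem;
    // Climb until we reach the ShadowRoot (the only ancestor with a .host)
    while ( node && !node.host ) {
      node = node.parentNode;
    }
    if ( node && node.host ) {
      // Use the shadow host for the contains() check instead
      return node.host;
    }
  }
  return elem;
}

// jQuery.contains( docElem, elementForContainsCheck( elem ) ) would then
// behave exactly as before for ordinary elements.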

@timmywil
Member

If it's an issue with contains, then it affects a multitude of jQuery methods and method chains (any that end up using uniqueSort). This seems to be a more fundamental issue that would benefit from adding tests for shadow roots across the board.

@TeaSeaLancs

Well, okay, to elaborate, it only actually makes a difference if, out of the two arguments passed in, one is under a shadow root and the other isn't. For example, contains( elem.ownerDocument, elem ), where elem is under a shadow root.

I agree it would affect a lot of different methods, especially anything which uses contains to check whether an element is currently within the DOM. However, a proper fix would hopefully be able to do something better than node traversal to find the root/host. Full traversal every time contains is called, on both arguments, would probably have a bit of a speed impact?

@dmethvin
Member

Seems like this needs to be solved via some changes or additions to DOM methods? I can't imagine that the design of shadow root would effectively destroy the use of DOM .contains() and always require climbing the tree. Either that or the intent was indeed to hide the containment, in which case it seems wrong for us to re-reveal it (at least, not in the same method).

@timmywil
Member

@TeaSeaLancs We could limit the performance degradation to cases where shadow roots need to be found. However, @dmethvin brings up a good point. Sometimes we just need to get browsers to fix the problem.

@TeaSeaLancs

It's a bit of a funny thing I think. On the one hand, elements under a shadow root are interpreted as their own portion of the DOM, separate from the main body, so document.documentElement.contains(elem), where elem is under a shadow root, SHOULD return false.

Also note that the W3C spec details that for an element under a shadow root, the ownerDocument property should be set to the shadow host's value of ownerDocument (the shadow host being the element which is hosting the shadow root). I'm not sure why.

On the other hand, though, there currently isn't any method or function specified as part of the W3C spec which acts like ownerDocument but takes shadow roots into consideration. The only thing I've found thus far which can tell us definitively whether an element is under a shadow root is doing elem.matches(":host *") (:host being a selector representing a shadow host).

I don't necessarily think that the intention is to conceal the containment as such, as given a shadow host element you can easily retrieve its shadow root via the shadowRoot property; however, I think the intention IS that it's a separate DOM area, hidden from the main DOM. I get the feeling that this is a lacking part of the specification at the moment, so yes, perhaps launching a browser inquiry would be a good idea.
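
For example (using Chrome's createShadowRoot API discussed in this thread; #my-widget is a hypothetical element already in the page):

var host = document.querySelector( "#my-widget" );
var root = host.createShadowRoot();
root.appendChild( document.createElement( "span" ) );

host.shadowRoot === root;               // true: the host exposes its shadow root
document.contains( root.firstChild );   // false: contains() stops at the boundary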

@dmethvin
Member

I'm going to pull @mikesherov into this for his opinion, and also to see if he could recommend a good place to raise the issue.

@ghost

ghost commented Nov 3, 2014

I just ran into this issue. I'm going to have to agree with TeaSeaLancs; it looks like the specification needs to be rethought and updated. There doesn't seem to be a palatable fix with the current DOM methods.

@scottgonzalez
Member

Here's a discussion from last year about this: http://lists.w3.org/Archives/Public/public-webapps/2013JanMar/0881.html

@dmethvin dmethvin changed the title .offset() thinks all elements in the ShadowDom are disconnected, and thus as 0,0 Issues using ShadowDom and ShadowRoot elements Nov 24, 2014
@nazar-pc
Contributor

I wrote patch gh-1976 that makes $.offset() work properly, and it is actually very similar to what @mzgol wrote in the first comment.
The thing is that element.getBoundingClientRect(), which jQuery uses under the hood, works fine with such elements; the only problem we have is checking whether the element is disconnected from the DOM or not. I didn't find a better way to do that than recursive traversal up through the parents.

It would be great to have some native browser function for this, but until then we need to make do with existing tools. The solution is not pretty, but it works as expected and doesn't add significant overhead for elements that are not inside the shadow DOM.

@NekR
Contributor

NekR commented Jan 27, 2015

if (elem.createShadowRoot) {
  var rect = elem.getBoundingClientRect();
  // disconnected or hidden
  if (!rect.height && !rect.width) {
    return box;
  }

  box = rect;
} else {
  // Make sure it's not a disconnected DOM node
  if ( !jQuery.contains( docElem, elem ) ) {
    return box;
  }

  box = elem.getBoundingClientRect();
}

This is another way to fix this issue. This method does not need traversal of the tree to detect whether the element is disconnected. getBoundingClientRect() does nothing on disconnected nodes and always returns zeroes. It also returns zeroes for hidden elements. So if the element is hidden or disconnected, we return zeroes; if it is visible and in the tree (anywhere in the tree), we return the real box.

One problem here is that for hidden elements, jQuery currently returns a box of { top: 0 + pageYOffset, left: 0 + pageXOffset }, while for disconnected elements it always returns { top: 0, left: 0 }.
I do not know whether this artifact is useful or whether someone relies on it. It might break compatibility.
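
To illustrate the difference (a paraphrase of the current behaviour, not actual jQuery source):

// Hidden (display: none) element that is still in the document:
//   getBoundingClientRect() is all zeroes, but the scroll offsets are still
//   added, so .offset() returns roughly { top: pageYOffset, left: pageXOffset }
// Disconnected element:
//   the contains() guard returns early with exactly { top: 0, left: 0 }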

Anyway, I will just leave it here as an alternative solution.

@NekR
Contributor

NekR commented Jan 27, 2015

Also, as far as I can see, there are no tests for such behavior. And if you think that this method will not break compatibility, then we can even drop the condition there and always execute the first branch.

@dmethvin
Member

It's possible we could use something like the solution @NekR proposes. The docs for .offset() specifically say it doesn't work for hidden elements.

@nazar-pc
Contributor

@dmethvin, in this case the code above can be simplified to just box = elem.getBoundingClientRect();

@TeaSeaLancs

I think there's a point missing here: if it was only ever as simple as @nazar-pc indicates, why does the check for a disconnected DOM node exist in the first place?

Is it a browser workaround which is no longer needed? Or is it an optimisation step to prevent a needless call to a potentially expensive function?

If it's the former then sure, but if it's the latter do we not have to think about performance?

@nazar-pc
Contributor

Thanks to PhpStorm and its git integration, I found where that code originated:
ea507b3
http://bugs.jquery.com/ticket/7190

@TeaSeaLancs

So the question becomes: does that bug still affect any of the browsers supported by jQuery 2.x? If not, then it seems like a fairly decent solution for .offset().

I'm not sure whether the solution would work for all of the issues reported with elements under a shadow DOM, though (which would ostensibly be any function that uses jQuery.contains() as a way of filtering out disconnected nodes). What do you think?

@dmethvin
Member

Actually the ticket @nazar-pc references seems to indicate that we were trying to support a return value of top/left 0 for disconnected elements. So any change should preserve that.

@nazar-pc
Contributor

Anything that uses native functions should work fine, just like $.offset(), and without the redundant check it will work even faster than before.
We need to check all usages of .contains() and determine in each particular case whether we still need it; in many cases it might be replaced by native element.matches().

@dmethvin, but that adds a really big overhead to handle disconnected nodes, especially with shadow DOM support.

Maybe, with performance in mind (the number of elements in a DOM tree with shadow DOM might be much bigger than before), we can drop support for this edge case? In my opinion it's worth it, especially with a new major version. Also, it will not break any well-written code (there is something wrong with getting the offset of a disconnected node).

@dmethvin
Member

Also, it will not break any well-written code (there is something wrong with getting the offset of a disconnected node).

I agree with that completely. The problem is, people write bad code all the time, sometimes by accident when cases like this quietly return values that make it seem like everything is okay. I would have preferred that ticket be closed with, "Why are you getting the offset of an element not in the document? We'll update the docs to state that's not valid." It's very possible though that some code now depends on the zero return values. That's what has me concerned.

Any other opinions?

@gibson042
Member

From an API perspective, the best behavior would be to throw when offset is meaningless (e.g., truly disconnected elements or non-elements). From a documentation perspective, the best behavior would be to exclude such elements from the range of valid input, in which case the return value is unspecified (and allowed to be cross-version/cross-branch inconsistent) but { top: 0, left: 0 } is as good as anything else.

So I'm perfectly content with @NekR's proposal, provided it doesn't introduce an inconsistency between master and compat for handling of shadow DOM elements.

Edit: added other examples of meaningless input

@nazar-pc
Contributor

Well, guys, I have Chromium 39.0.2171.65 with native support for ShadowDOM and other stuff and guess what:

var y = document.createElement('div')
y.getBoundingClientRect()

outputs

ClientRect {height: 0, width: 0, left: 0, bottom: 0, right: 0…}

Now Firefox Nightly 38.0a1 (2015-01-25), guess output?

DOMRect { x: 0, y: 0, width: 0, height: 0, top: 0, right: 0, bottom: 0, left: 0 }

So why do we need to check for anything here? It already gives the correct zeros.

Where the problem comes from is var docElem = elem.ownerDocument.documentElement, which is used when computing the returned value. It will be undefined only on disconnected nodes, regardless of whether the node is inside the shadow DOM or not. It will likely be the same in any browser, so we do not need to call .contains() or anything else.

Finally:

if (!elem.ownerDocument.documentElement) {
    return { top: 0, left: 0 };
}
box = elem.getBoundingClientRect();

Pretty straightforward. Not sure why it wasn't this way from the beginning. What do you think?

@nazar-pc
Contributor

Moreover, in .offset() there is a check that elem.ownerDocument is not falsy, but MDN states that it can only be null when called on the document itself; does it make sense to call .offset() on the document?
In all other cases the check is redundant.

@NekR
Contributor

NekR commented Jan 27, 2015

@dmethvin wrote:

Actually the ticket @nazar-pc references seems to indicate that we were trying to support a return value of top/left 0 for disconnected elements. So any change should preserve that.

That ticket removes scroll positions for disconnected nodes, but not for hidden nodes. If we use only getBoundingClientRect(), then it will also remove scroll positions for hidden nodes. It seems correct to me that hidden elements should always have zeroes too.

@nazar-pc wrote:

Well, guys, I have Chromium 39.0.2171.65 with native support for ShadowDOM and other stuff and guess what:

I did not get your comment. Do you mean that for attached and visible nodes it always returns zeroes? If so, that's very strange, because for me on Chrome 40 it works correctly.

To summarize:

  • element.contains/compareDocumentPosition does not work with shadow DOM as we expect -- it cannot match nodes inside a shadow DOM
  • element.getBoundingClientRect does work as expected (at least in Chrome 40) and returns zeroes (top, left, bottom, right) for hidden and disconnected elements, regardless of whether the node is in the shadow or light DOM (see the snippet below)
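
A quick snippet showing both points (assuming Chrome's createShadowRoot, with the host attached to the document and visible):

var host = document.body.appendChild( document.createElement( "div" ) );
var root = host.createShadowRoot();
var inner = root.appendChild( document.createElement( "div" ) );
inner.style.height = "10px";

document.documentElement.contains( inner );   // false -- contains() cannot see into the shadow tree
inner.getBoundingClientRect().height;         // 10 -- the element renders and has a real box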

@nazar-pc wrote:

Where the problem comes from is var docElem = elem.ownerDocument.documentElement, which is used when computing the returned value. It will be undefined only on disconnected nodes, regardless of whether the node is inside the shadow DOM or not. It will likely be the same in any browser, so we do not need to call .contains() or anything else.

Finally:
...

If it works then it's a very nice way; I am just not sure about IE8, because there disconnected nodes are not really disconnected. Does anyone have IE8 to test it?

@nazar-pc
Contributor

I did not get your comment. Do you mean that for attached and visible nodes it always returns zeroes? If so, that's very strange, because for me on Chrome 40 it works correctly.

No, in the example next to that line you can see that it returns zeros on a disconnected node.

Does anyone have IE8 to test it?

Will jQuery 3.0 support this very old browser? Or did you mean the compat version?

@NekR
Contributor

NekR commented Jan 27, 2015

No, in the example next to that line you can see that it returns zeros on a disconnected node.

Oh, yes, I overlooked that, sorry. This is exactly why I proposed using only getBoundingClientRect() without traversing the DOM tree -- because it works.

Will jQuery 3.0 support this very old browser? Or did you mean the compat version?

I really do not know which version we are talking about here, but it seems to me that both versions are broken with shadow DOM, so both should be fixed. If I am wrong, please correct me.

@dmethvin
Member

API-wise, jQuery 3.0 and jQuery Compat 3.0 are the same. The only difference is the browsers they support, IE8 being the odd one of course. We'd want the API to work (have consistent results) in the latest Chrome if a site is using either branch.

There definitely needs to be some IE8 experimenting done here, because calling .getBoundingClientRect() on a disconnected element throws an error there. I'm not sure whether a tree climb is more expensive than a try/catch wrapper; I really dislike try/catch though.
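
A try/catch wrapper of that kind would look roughly like this (sketch only; the exact IE8 behaviour still needs verifying):

var box = { top: 0, left: 0 };
try {
  box = elem.getBoundingClientRect();
} catch ( e ) {
  // Old IE throws on disconnected elements; keep the zeroed box in that case
}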


NekR added a commit to NekR/jquery that referenced this issue Jan 28, 2015
@gibson042
Member

Honestly, I'd prefer throwing an error instead of returning inaccurate { top: 0, left: 0 } results—and we could do so with abandon if disconnected elements are excluded from valid input.

NekR added a commit to NekR/jquery that referenced this issue Jan 28, 2015
Add guard for ```createShadowRoot``` existence and return early if
user-agent does not support ShadowDOM and elem is disconnected

Fixes jquery#1784
@NekR
Contributor

NekR commented Jan 28, 2015

I am now making small steps to resolve this issue in a branch of my fork (GitHub has already shown it here).
I will soon download a WinXP virtual machine and test everything on IE8 to find the correct solution.
In the meantime, tests were failing for me with or without my changes; now it's OK, no more getBoundingClientRect errors.
[screenshot: jquerytestcom_28012015_23016]

Honestly, I'd prefer throwing an error instead of returning inaccurate { top: 0, left: 0 } results—and we could do so with abandon if disconnected elements are excluded from valid input.

Hmm, I think this might break compatibility, since jQuery has never thrown errors there.

@NekR
Contributor

NekR commented Jan 28, 2015

Unfortunately I cannot run the tests on IE8, even if I check out the "compat" branch.
But I tested elem.ownerDocument.documentElement, and it does not help for IE8, because this property exists even for disconnected nodes.

As you may see here https://github.com/NekR/jquery/compare/1784-fix-offset-in-shadow-dom?expand=1 I use this condition:

if ( !elem.createShadowRoot && !jQuery.contains( docElem, elem ) ) {

which uses contains only if the user agent does not support shadow DOM. This keeps getBoundingClientRect from being called on disconnected nodes in older browsers such as IE8.
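
In context, the relevant part of offset() would look roughly like this (paraphrased from the branch linked above; the surrounding code may differ):

var box = { top: 0, left: 0 };

// Fall back to the contains() check only when the browser has no shadow DOM
// support; otherwise rely on getBoundingClientRect() returning zeroes for
// disconnected and hidden elements
if ( !elem.createShadowRoot && !jQuery.contains( docElem, elem ) ) {
  return box;
}
box = elem.getBoundingClientRect();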

Questions:

  • Should I make a pull request right now, or do we need to discuss it more?
  • How will it be merged with the "compat" branch? Should I fix it in the compat branch myself, or will you do it by merging?
  • Of course I should write tests for it, but do we need tests only for shadow DOM, or for disconnected/hidden elements too (to make sure they always return a { top: 0, left: 0 } rect)?

@gibson042
Member

Hmm, I think this might break compatibility, since jQuery has never thrown errors there.

It doesn't violate our currently-documented API; http://api.jquery.com/offset/ already includes "jQuery does not support getting the offset coordinates of hidden elements". And even if that were not the case, backwards-incompatible changes are allowed with the upcoming major version bump.

if ( !elem.createShadowRoot && !jQuery.contains( docElem, elem ) ) {

I don't like this conflation of unrelated behaviors.

Should I make a pull request right now, or do we need to discuss it more?

Please make a pull request; further discussion can take place there.

How will it be merged with the "compat" branch? Should I fix it in the compat branch myself, or will you do it by merging?

Our general guideline is that a separate PR is required only when the solution looks sufficiently different between the two branches that cherry-pick is impractical. For this ticket, one will be enough.

Of course I should write tests for it, but do we need tests only for shadow DOM, or for disconnected/hidden elements too (to make sure they always return a { top: 0, left: 0 } rect)?

That depends on whether or not we want to guarantee such return values. For now, limit new tests to Shadow DOM.

@nazar-pc
Contributor

There are tests for ShadowDOM in my fork

@NekR
Contributor

NekR commented Jan 29, 2015

It doesn't violate our currently-documented API; http://api.jquery.com/offset/ already includes "jQuery does not support getting the offset coordinates of hidden elements". And even if that were not the case, backwards-incompatible changes are allowed with the upcoming major version bump.

OK, I have no problem with it; I just wondered about possible issues. If you think it will not be much of a problem, then it's probably fine.

Other answers are in the PR.
