With very large lists, the DOM updates slowly when filtering up (clearing the query). #13

Open
brianlittmann opened this issue Apr 3, 2013 · 6 comments

@brianlittmann

In my case, I have a list of 8000+ elements. Filtering down works fine, but when clearing the filter (or removing most of it), the function finishes quickly but the DOM takes too long to update, often freezing the page.

A simple solution is to detach the elements before changing each one's display property, then append them back afterward.

// Detach the list items first so the per-item display changes
// don't trigger layout work while the loop runs
var detached_lis = lis.detach();
for (var i = 0; i < len; i++) {
  ...
}
// Re-attach everything in a single DOM operation
detached_lis.appendTo(list);
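
A fuller, self-contained sketch of the same idea (the #mylist / #filter selectors and the substring match are placeholders, not the plugin's actual markup or matching logic):

var list = $('#mylist');                          // hypothetical list selector
var lis = list.children('li');
var len = lis.length;
var filter = $('#filter').val().toLowerCase();    // hypothetical search box

// Detach once so per-item style changes don't trigger layout work
var detached_lis = lis.detach();
for (var i = 0; i < len; i++) {
  var li = detached_lis[i];
  var matches = li.textContent.toLowerCase().indexOf(filter) >= 0;
  li.style.display = matches ? '' : 'none';
}
// Re-attach everything in a single DOM operation
detached_lis.appendTo(list);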
@brianlittmann
Author

Note: This works in Chrome (26), but appears to have adverse effects in Firefox (19) and makes IE 9 unresponsive. I probably shouldn't be filtering 8000+ elements anyway...

@mattjacobson

I am having the same issue with a smaller list size (noticeable hangups filtering ~1.5k entries). Admittedly that is still a very large list to be filtering client side.

The weird thing is that forward filtering (that is, typing a query) works great, but erasing all of the characters in the search box makes everything lag hard.

My JavaScript skills are a bit lacking, but I had a thought: why can't we have an external condition that short-circuits the loop? Pseudocode as follows:

on-change {
  if (filter is empty) {
    reset all elements to visible
  } else {
    run the rest of the loop
  }
}

Any thoughts? (Will it work? Is anyone willing to help me code that bit up?)
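
Roughly what I have in mind, in jQuery terms (the selectors are placeholders, and the else branch would fall through to the plugin's normal filtering loop):

$('#filter').on('keyup', function () {            // hypothetical search box
  var query = $(this).val();
  var lis = $('#mylist').children('li');          // hypothetical list

  if (query === '') {
    // Short-circuit: show everything at once instead of looping per item
    lis.css('display', '');
  } else {
    // ...run the plugin's normal filtering loop here...
  }
});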

[screenshot: fast-filter console results]

@brianlittmann
Author

That's sort of a one-off solution. After you've filtered down to "gold", what happens if you delete all but one character, leaving "g"? Is it still slow? 1.5k elements shouldn't be too troublesome, as the demo has over 2k.

Also, your actual search time is quite slow. For me, the search itself didn't take too long; it was updating the DOM that took forever.
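
If it helps, you can split the timings apart with something like this (the selectors are placeholders for your own markup) to see whether the matching or the DOM update is the slow part:

var query = $('#filter').val().toLowerCase();     // hypothetical search box

console.time('search');
// Matching only, no DOM writes yet
var matched = [];
$('#mylist li').each(function () {                // hypothetical list selector
  if ($(this).text().toLowerCase().indexOf(query) >= 0) {
    matched.push(this);
  }
});
console.timeEnd('search');

console.time('dom-update');
$('#mylist li').hide();
$(matched).show();
console.timeEnd('dom-update');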

@mattjacobson

Truthfully, I'm not sure what is causing the holdup, but deleting all of the characters at once (i.e. clearing the filter) certainly seems to hold things up quite a bit.

I ran a couple of tests; the screenshot below shows the timings:
[screenshot: fast-filter console results 2]

It moves decently fast when adding one character at a time, and it even works great going from empty to "gold" directly. But going from "gold" back to "" makes the entire browser window hang.

On second look, I realize that there is no problem with the example page, and I am beginning to think the problem is in my implementation.

It's just very baffling to me.

@leosok

leosok commented Apr 17, 2013

I have exactly your problem! I have 8109 entries, and "forward" filtering works perfectly in both Chrome and Firefox, but "backwards" it kills Chrome and makes Firefox take a long break. It would be awesome if someone could sort that out. Here's a fiddle with my original data:

http://jsfiddle.net/Wp8bJ/

@leosok

leosok commented May 23, 2013

Any idea if there is a fix?
