TL;DR: JS solutions are often better for accessibility. At least the information is conveyed.
The Popover API will be more useful than ever.
The tradeoff for now is the `<details>` element, which has two limitations: it does not announce that a navigation menu has been expanded, and clicking outside or pressing Esc does nothing.
The author of Kelp uses public and private cascade layers.
The public layers are: theme, extend, overrides, and effects.
The `lh` unit can be helpful for adjusting the distance of the underline from the text.
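As a small sketch of the idea, the underline offset can be expressed in line-height units so it scales with the text (the 0.15lh value is my own placeholder, not from the article):

```css
/* Offset the underline as a fraction of the line-height */
a {
  text-decoration-line: underline;
  text-underline-offset: 0.15lh; /* 1lh = one line-height of the element */
}
```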
Centering text and containers
Load desktop styles only when needed and design for mobile first.
I can implement this strategy on my website to test it.
Different underlines for links: the regular underline, the underline with an offset, the thin underline, the translucent underline, or the dotted underline.
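The variants listed above could be sketched like this (class names and values are mine, for illustration):

```css
.regular     { text-decoration: underline; }
.offset      { text-decoration: underline; text-underline-offset: 0.3em; }
.thin        { text-decoration: underline; text-decoration-thickness: 1px; }
.translucent { text-decoration: underline; text-decoration-color: rgb(0 0 0 / 0.35); }
.dotted      { text-decoration: underline dotted; }
```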
I thought of :focus-within and got it right. The author shares a JS snippet and a method using the new :has() selector.
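A rough sketch of the :focus-within idea for a keyboard-friendly dropdown (the class names are hypothetical, not the author's):

```css
/* Keep the submenu visible while the pointer hovers the item
   or while any link inside it has keyboard focus */
.menu-item .submenu {
  display: none;
}
.menu-item:hover .submenu,
.menu-item:focus-within .submenu {
  display: block;
}

/* With :has(), the same idea can key off a specific descendant's state */
.menu-item:has(> a:focus) .submenu {
  display: block;
}
```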
How many of the basic CSS rules do you know?
The stats are one thing.
The comments are another.
And Piccalilli shares them: https://piccalil.li/links/the-state-of-css-2025-results-are-in/
Modular CSS or a bundle? It follows up on Rethinking modular CSS and build-free design systems.
On first load, modular CSS files are worse.
Once the files are cached, subsequent renders are just 100ms to 200ms slower with modular files than with one bundled file.
Given that the guiding ethos of Kelp is that the web is for everyone, it looks like I should probably be encouraging folks to use a bundled version as the main entry point.
How to handle colors with the new CSS color functions instead of Sass?
The best seems to be color().
There are also rgba(), hsla(), and color-mix().
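A hedged sketch of replacing typical Sass color helpers with native CSS (the custom property name and values are my own examples):

```css
:root {
  --brand: oklch(60% 0.15 250);
}

.button {
  background: var(--brand);
  /* lighten() replacement: mix 20% white into the brand color */
  border-color: color-mix(in oklch, var(--brand), white 20%);
}

.button:hover {
  /* darken() replacement: mix 15% black in */
  background: color-mix(in oklch, var(--brand), black 15%);
}
```

Unlike Sass functions, these are resolved by the browser at runtime, so they also work with values that only exist at runtime, like custom properties.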
It’s evident that there is a growing array of new CSS capabilities that can handle many tasks we previously relied on Sass for.
Yes indeed. I also agree: it's a per-project decision to make.
How to build a masonry layout that works, in 66 lines of JS.
CSS builds result in faster loading times.
Buildless CSS also has advantages.
But nested @import must be avoided.
The author's focus is on https://kelpui.com/docs/tools/install/
Pick what you want from the CSS library and it spits out the imports.
font-size-adjust: ex-height 0.5 instructs the browser to scale the font so that the letter "x" is exactly half of the em box.
A use-case for font-size-adjust I find much more compelling is that you probably are going to use several fonts on a web-page. And you also might change fonts in the future. And they will have different intrinsic size because that’s how the things are. Part of the mess is avoidable by pinning the meaning of font size.
0.53 is the invariant ratio for Helvetica.
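As a sketch, pinning the x-height means a fallback font renders at the same apparent size as the webfont (the font stack here is my example; 0.53 is Helvetica's ratio, per the note above):

```css
body {
  font-family: "Some Webfont", Helvetica, Arial, sans-serif;
  /* Whatever font actually renders, scale it so its x-height
     is 0.53 of the font size — the same ratio as Helvetica */
  font-size-adjust: ex-height 0.53;
}
```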
I totally agree.
- When there is more than one text directionality
- When the respective expression would be shorter than the non-logical equivalent.
The second point is healthy for every project.
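Some examples of that second point, where the logical property is genuinely shorter than its physical equivalent (selectors are mine):

```css
.card {
  margin-inline: auto;   /* instead of margin-left: auto; margin-right: auto; */
  padding-block: 1rem;   /* instead of padding-top: 1rem; padding-bottom: 1rem; */
  inset-inline-start: 0; /* adapts to RTL text, unlike left: 0 */
}
```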
Instead of BlurHash, which needs additional JS for an 83-bit string, some CSS snippets can do the work for Low-Quality Image Placeholders (LQIP).
The big disadvantage of pure CSS approaches is that you typically litter your markup with lengthy inline styles or obnoxious data URLs.
A BlurHash-like method can also be used in CSS, and that's what the author describes.
A simpler solution is to use a single color as a placeholder.
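The single-color variant could be as small as this (the color value is an example; in practice it would be the image's precomputed dominant color, set inline per image):

```css
/* Paint the dominant color behind the image while it loads */
img.lqip {
  background-color: #6b7d8a; /* precomputed dominant color of this image */
}
```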
tl;dr: the issue isn’t the @import rule itself, but that files under 1kb often end up the same size or even bigger when gzipped, so you get no compression benefits.
The experiment shows that tiny atomic CSS files are not optimal.
If the files I was importing were larger, it might make sense. As tiny, modular files? Not so much!
The complete library concatenated and gzipped is less than a single HTTP request. It’s just over 25-percent of the transfer size of sending modular gzipped files instead.
There are many methods that break performance and accessibility, such as dynamic CSS classes and div-and-span soup.
Topics addressed, from the outline:
- It’s not just bad HTML – it’s meaningless markup
- Semantic rot wrecks performance
- Big DOMs are slow to render
- Complex trees cause layout thrashing
- Redundant CSS increases recalculation cost
- Autogenerated classes break caching and targeting
- Animations and the compositing catastrophe (with properties triggering the layout engine)
- Semantic tags can provide layout hints
- (AI) Agents are the new users and they care about structure
- Structure is resilience, and it isn't optional
Likewise, content-visibility: auto is one of the most underrated tools in the modern CSS arsenal. It lets the browser skip rendering elements that aren’t visible on-screen – effectively “virtualising” them. That’s huge for long pages, feeds, or infinite scroll components.
and I didn't know about the contain CSS property.
contain: layout; tells the browser it doesn’t need to recalculate layout outside the element.
will-change: transform; hints that a compositing layer is needed.
isolation: isolate; and contain: paint; can help prevent visual spillover and force GPU layers.
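Put together, the properties above could be combined for a long feed like this (the selector and the 320px estimate are my own placeholders):

```css
.feed-item {
  content-visibility: auto;           /* skip rendering off-screen items */
  contain-intrinsic-size: auto 320px; /* reserve an estimated height so the scrollbar doesn't jump */
  contain: layout paint;              /* keep layout and painting inside the element */
}
```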
```css
@view-transition {
  navigation: auto;
}

::view-transition-old(root),
::view-transition-new(root) {
  animation: fade 0.3s ease both;
}

@keyframes fade {
  from { opacity: 0; }
  to { opacity: 1; }
}
```
To animate a thumbnail between two pages, the `<img>` tag can be given view-transition-name: product-image;
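A sketch of how that could look, assuming a listing page and a product page (selectors are hypothetical):

```css
/* The thumbnail on the list page and the hero image on the product page
   share the same name, so the browser morphs one into the other */
.product-card.is-active img,
.product-page .hero img {
  view-transition-name: product-image;
}
```

One caveat: a view-transition-name must be unique per page, so on the listing page it should only be set on the card that was actually clicked, for example via a class or inline style.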
This is a microcosm of a much bigger theme: browsers are evolving to reward simplicity and resilience. They’re building for the kind of web we should have been embracing all along. And SPAs are increasingly the odd ones out.
SPAs were a clever solution to a temporary limitation. But that limitation no longer exists.
And I agree: SPAs are overkill for 95%-99% of websites.