Sources


GitHub Actions has grown massively since its release in 2018; in 2025 alone, developers used 11.5 billion GitHub Actions minutes in public and open source projects, up 35% year over year from 2024. At the same time, this growth has not been without its pains, and you’ve made clear to us which improvements matter most: faster builds, improved security, better caching, more workflow flexibility, and rock-solid reliability.

…

This was not without its share of pain; it slowed the pace of feature work and delayed progress on long-standing community requests.

…

### Larger caches for bigger projects and dependency-heavy builds

Repositories can now exceed the previous 10GB cache limit, removing a long-standing pain point for teams with large dependencies or multi-language monorepos. For teams with larger codebases or complex build pipelines, the old 10GB GitHub Actions cache limit often meant build dependencies were evicted before they could speed up the next workflow run, leading to repeated downloads and slower builds. This release was only possible because of our architecture rework, and it fulfills a long-standing community request, particularly among some of our largest users.

…

### More performance and platform improvements shipped in 2025

We also built on the strong foundation laid earlier this year, including arm64-hosted runners for public repositories, macOS 15 and Windows 2025 images (now generally available), Actions Performance Metrics (also generally available), and Custom Image support in public preview. These releases are designed to improve day-to-day workflow quality and remove long-standing friction.

12/11/2025 · Updated 3/26/2026

It was also interesting to hear the other side of the story, especially why things took so long (a big internal infrastructure migration at GitHub) and why it feels like the product is not being improved (we both agreed that GitHub should improve its communication).

...

Before I rant about GitHub Actions, I'd like to set the context for where this dissatisfaction comes from. My team consists of about 15 engineers constantly pushing to the main branch. Our code lives in a monorepo split per module, which, through trunk-based development, gets deployed multiple times a day.

…

`web-app1` folder. So if my pull request only made changes in `api1`, I will never be able to merge it! 🤯 In these two GitHub threads 1, 2 you can see the impact. The bottom line is that working around this limitation is hacky, difficult to maintain, and costly, since you have to run additional pipelines just to determine whether a pull request can be merged. GitHub should not rely on specific names for required checks. It could simply say that all checks have to pass before you merge; that way, whatever pipelines and checks your pull request triggered would be considered mandatory. It's been almost 3 years since these issues were raised, and nothing has changed yet!

My impression is that as your pipeline grows, it becomes more and more difficult to manage with GitHub Actions. Here is an example workflow that can be called from other workflows, can be triggered manually, and runs when someone pushes to the master branch.

…

What I notice more and more is that I have to add lots of if statements such as this: `if: ${{ github.event_name == 'push' || inputs.target_environment == 'production' }}`. While I could split this into multiple workflows with different triggers, that means more and more files to maintain.
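A minimal sketch of the kind of multi-trigger workflow described above (all names, paths, and inputs here are hypothetical, not the author's actual pipeline):

```yaml
name: deploy

on:
  push:
    branches: [master]        # runs on pushes to master
  workflow_dispatch:          # manual trigger from the Actions UI
    inputs:
      target_environment:
        type: string
        default: staging
  workflow_call:              # reusable from other workflows
    inputs:
      target_environment:
        type: string
        default: staging

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Conditions like this accumulate once one workflow serves several triggers
      - name: Deploy
        if: ${{ github.event_name == 'push' || inputs.target_environment == 'production' }}
        run: ./deploy.sh
```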
Reusing a workflow should be a one-liner, but I always end up writing more lines and a lot of duplicated statements such as the workflow name, `secrets: inherit`, etc. Our `.github` folder already contains 30+ files.

Another frequent pitfall is the `needs` clause. When you are refactoring and removing jobs, it's easy to forget to update this clause in the subsequent jobs. While there are linters available, they are not perfect. The sad part is that I only see these mistakes after I push the workflow; I'd expect to know about them much earlier.

…

Of all the pain points, this is the worst one. It seems that GitHub doesn't care about fixing any of these issues or improving its product. Some of the threads have been open for years without any action taken by GitHub. Many of these issues have recently been closed by GitHub, causing a backlash from the community, and based on the public roadmap there are no signs they will be addressed. Considering all the problems listed, and the lack of motivation from GitHub, I'd think twice before using GitHub Actions again. The CI/CD product space offers plenty of options, such as GitLab, Jenkins, TeamCity, etc.
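For comparison, this is roughly the caller-side boilerplate being described; a hedged sketch with hypothetical workflow and job names:

```yaml
name: ci                      # metadata duplicated in every calling workflow

on:
  pull_request:

jobs:
  build:
    uses: ./.github/workflows/build.yml   # the "one-liner" reuse, plus its boilerplate
    secrets: inherit
  test:
    needs: build              # easy to leave dangling when `build` is renamed or removed
    uses: ./.github/workflows/test.yml
    secrets: inherit
```

A static checker such as actionlint catches some dangling `needs` references before pushing, though, as noted above, such linters are not perfect.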

1/30/2025 · Updated 1/26/2026

## My problem with Python

Like all programming languages, Python has its shortcomings, which I can easily list. These flaws become obvious when I compare it to the other programming languages I use:

- It’s slow (not compiled)
- It depends heavily on C for heavy tasks
- Desktop applications are possible, but it’s a poor fit for them
- No built-in linear algebra functionality
- No integrated dataframe management functionality
- No integrated statistics or machine learning functionality
- Its object-oriented style doesn’t integrate well with any of the above functionality (NumPy and Pandas, for instance, feel weird to use, personally)

1/7/2024 · Updated 10/6/2025

### Dealing with Library Overload

Python's community is very much alive, with developers constantly creating new libraries and modules. The consequence is that choosing the library that best fits a project becomes challenging simply because of the number of options available. In many cases, developers have to weigh documentation, community support, and the overall maturity and stability of each candidate.

### Keeping up with Rapid Ecosystem Changes

The Python ecosystem is constantly evolving, with new versions of the language, libraries, and frameworks released regularly. Staying up to date with these changes can be a daunting task for Python developers. Keeping track of breaking changes, deprecations, and the latest features is time-consuming and requires a significant investment of effort.

### Managing Compatibility and Version Conflicts

The interdependency between libraries and the rapid evolution of the ecosystem can lead to compatibility issues and version conflicts. Developers may find that a library or framework they need is not compatible with the version of Python or the other dependencies in their project. Resolving these conflicts can be a complex and time-consuming process, often requiring extensive research and troubleshooting.

…

Novice developers, in particular, may struggle to understand these messages and find the appropriate solutions, leading to a time-consuming and frustrating debugging process. Moreover, debugging asynchronous and concurrent code in Python presents its own set of challenges. Asynchronous programming features like asyncio and multithreading can introduce complexities such as race conditions and deadlocks, making issues harder to identify and resolve.

…

### Balancing Technical Debt and Innovation

As Python-based applications grow in complexity, technical debt can accumulate, making it increasingly difficult to introduce new features or make updates. Developers must strike a delicate balance between addressing technical debt and introducing innovative solutions to meet evolving customer demands.
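The asyncio race conditions mentioned earlier can be reproduced in a few lines. This is an illustrative sketch (not from the original article): a read-modify-write that spans an `await` loses updates unless it is guarded by an `asyncio.Lock`:

```python
import asyncio

counter = 0

async def main():
    global counter

    # Lost update: every task reads the counter, yields at the await,
    # then writes back a stale value, overwriting concurrent increments.
    counter = 0
    async def unsafe_increment():
        global counter
        current = counter
        await asyncio.sleep(0)  # suspension point: other tasks run here
        counter = current + 1
    await asyncio.gather(*(unsafe_increment() for _ in range(100)))
    lost = counter

    # The same update guarded by a lock is atomic across the await.
    counter = 0
    lock = asyncio.Lock()
    async def safe_increment():
        global counter
        async with lock:
            current = counter
            await asyncio.sleep(0)
            counter = current + 1
    await asyncio.gather(*(safe_increment() for _ in range(100)))
    return lost, counter

lost, safe = asyncio.run(main())
print(lost, safe)  # the unguarded version loses updates; the locked one counts all 100
```

The bug only appears under concurrency, which is exactly why such issues are harder to identify and resolve than ordinary exceptions.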

6/25/2024 · Updated 3/8/2026

## 1. Performance Issues

**Challenge:** Python is an interpreted language, which can lead to slower execution times compared to compiled languages like C or Java. This performance issue becomes particularly noticeable in applications that require heavy computation or real-time processing.

**Solution:** To address performance issues, developers can use various optimization techniques:

- **Profile Your Code:** Identify bottlenecks by using profiling tools like `cProfile` or `line_profiler`. These tools help pinpoint the exact sections of the code that are slowing down execution.
- **Optimize Algorithms:** Review and improve the efficiency of your algorithms. Sometimes, a more efficient algorithm can significantly reduce execution time.
- **Use Built-in Functions and Libraries:** Python’s built-in functions and standard libraries are often optimized for performance. Utilize these instead of writing custom code.
- **Leverage C Extensions:** For performance-critical sections, consider using Cython or writing extensions in C. These can significantly boost performance by compiling Python code to C.

…

## 2. Managing Dependencies

**Challenge:** Managing dependencies in a Python project can be tricky, especially as the project grows. Different environments and dependency versions can lead to conflicts and compatibility issues.

**Solution:** Effective dependency management can be achieved through:

…

## 4. Handling Large Codebases

**Challenge:** As projects grow, the codebase can become unwieldy and difficult to manage. This can lead to issues with code maintainability, readability, and collaboration among team members.

**Solution:** Implement strategies to manage and maintain large codebases effectively:
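The profiling step above can be sketched with the standard library alone; the deliberately naive `square_sum` function here is a hypothetical stand-in for a real hot path:

```python
import cProfile
import pstats

def square_sum(n):
    # Deliberately naive loop so the profiler has something to attribute time to;
    # sum(i * i for i in range(n)) or NumPy would be the optimized alternative.
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
result = square_sum(200_000)
profiler.disable()

# Show the five most expensive entries, sorted by cumulative time,
# to pinpoint where the run actually spends its time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

Once the bottleneck is confirmed, the same report serves as a before/after baseline for any algorithmic change or C extension.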

7/16/2024 · Updated 3/14/2026