Announcing our latest Bug Bounty connector: Synack. With the new Nucleus / Synack Connector, the gap between vulnerability management and crowdsourced security testing is much smaller. You can now easily (and automatically!) inject Synack-sourced security testing data into your vulnerability management process so that you can manage both sets of data in the same VM process.
Remember that connector builds are based on customer requests, so be sure to let us know if there are connectors that you’d like to see built, and hit that subscribe button to see when they’ll be coming to your Nucleus instance.
Editing Custom Findings After Creation
We’re committed to improving our ability to deliver on application security and penetration testing workflows for our customers, so in this release we’ve added something a lot of you have been asking for: the ability to edit ALL fields of a custom finding after it has been created.
Previously, if you wanted to update Port/Service information for a custom finding, you had to create a new instance of the custom finding and copy and paste data from another instance. Now you can keep your finding record intact while changing the location of the custom finding instance itself.
We’ve continued to make it even easier to extract the specific information you want to use in other systems, with updates to filtering on both the Active Vulnerabilities and Asset Management pages. Of particular note, you can now use a custom date selector to filter the discovered and last seen dates for vulnerabilities on the Active Vulnerabilities page, giving you more targeted metrics for reporting purposes.
For example, do you want to know which vulnerabilities were discovered in your environment before the Mayan calendar ended?
Or how about vulnerabilities that haven’t been scanned for since you last moved out of your parents’ house? We’ve got you. Just use the new custom date range filter.
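Under the hood, a custom date-range filter is just a bounds check on each vulnerability's discovered (or last seen) timestamp. A minimal sketch of the idea, using illustrative record fields rather than the actual Nucleus data model:

```python
from datetime import date

# Illustrative records; these field names are hypothetical, not the Nucleus API.
vulns = [
    {"id": "CVE-2012-0001", "discovered": date(2012, 11, 30)},
    {"id": "CVE-2021-0002", "discovered": date(2021, 3, 15)},
]

def filter_by_discovered(vulns, start: date, end: date):
    """Keep vulnerabilities whose discovered date falls within [start, end]."""
    return [v for v in vulns if start <= v["discovered"] <= end]

# Everything discovered before the Mayan calendar "ended" (21 Dec 2012):
ancient = filter_by_discovered(vulns, date(1970, 1, 1), date(2012, 12, 21))
```

The same bounds check applied to a last-seen field gives the "haven't been scanned since…" query.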
In this release, we’ve made asset merging a little easier for all. Now when two assets are merged in the UI, we’ll automatically update the new merged asset to include secondary matching information from the asset that was merged in. This means that after two assets are merged, future asset ingests won’t result in the old assets being created again!
This release comes with a slew of improvements to existing connectors – there’s a little of something for everyone.
Some highlights include being able to import all containers, hosts, and deployed images in one go from Prisma Cloud as well as setting additional metadata on assets from Bug Crowd. Scroll down to see a full list of connector changes.
Complete list of changes and bug fixes…
NEW There is a new Synack connector for ingesting bug bounty vulnerabilities.
UPDATE You can now use a custom date selector to filter the discovered and last seen dates for vulnerabilities on the Active Vulnerabilities page.
UPDATE The Bug Crowd connector now sets additional metadata on assets.
UPDATE Now when assets are merged, the merge is permanent by default: the merged asset’s secondary matching information is automatically updated with identifying information (such as asset name or IP address) from the non-primary assets, unless disabled during the merge.
UPDATE The Prisma Cloud connector has been updated to ingest at a much faster rate with more input from users on what specifically to import.
UPDATE The vulnerability details excel report has been updated to include the Asset Owner field on the Scan Data tab as well as the vulnerability’s exploitability and user comments on each Severity tab.
UPDATE You can now identify vulnerabilities that already have comments from the Active Vulnerabilities page.
UPDATE Custom finding instances on device assets can now be edited to change the service or port after creation.
UPDATE Miscellaneous optimizations to improve the speed of automation rules and asset counting.
UPDATE Asset search filtering on the Asset Management page now allows for special characters.
UPDATE Qualys WAS Scan ingestion now includes setting the HTTP request body if provided.
UPDATE You can now specify regions for vulnerability ingest rules for the AWS connector.
UPDATE Improvements to the speed of asset synchronization and vulnerability ingestion for the AWS connector.
UPDATE Ingestion of vulnerabilities from Rapid7 InsightVM and Nexpose now also sets the vulnerability’s exploitability based on additional criteria from Rapid7.
UPDATE When ingesting OWASP Dependency Check scan files, an Informational finding for files with no vulnerable dependencies is no longer created.
UPDATE Extended support for additional columns in Alertlogic scan files.
UPDATE The Nucleus Custom Finding JSON file now supports setting exploitability as a boolean value in addition to a string.
BUG FIX Filtering for an unknown operating system in the Asset Management page now also includes operating systems that are set as Unknown.
BUG FIX The Assetnote connector now links to the correct support page.
BUG FIX In limited situations vulnerabilities ingested from Assetnote would not set the instance path.
BUG FIX Improvements to the way that dynamic fields are applied to asset groups in asset processing rules.
BUG FIX In limited situations the vulnerability description and recommendation for Sonatype NexusIQ vulnerabilities were not comprehensive.
BUG FIX In limited situations container images ingested from Prisma Cloud would have empty brackets appended to the container path.
BUG FIX The Sonatype NexusIQ connector no longer allows for importing of unsupported scan types.
We’re chuffed to announce the next update to our Amazon Web Services connector, Amazon ECR! With the release of Amazon ECR support, you can synchronize container image repositories from your AWS accounts, as well as ingest vulnerabilities that have been found.
To get started, simply add the Amazon ECR Managed Policy AmazonEC2ContainerRegistryReadOnly to any cross-account roles that you’ve set up in the AWS connector. You’ll then be able to use your existing AWS connectors to synchronize repositories by creating Asset Sync Rules, and to ingest vulnerabilities by Importing via Connector or creating Vulnerability Ingest Rules.
When you’ve ingested data into your project, you’ll immediately notice that all additional asset metadata is set for the asset, which can be further used in the Nucleus Automation Engine. You can also swap between different images within the same repository.
“Great things are not done by impulse, but by a series of small things brought together” – Vincent van Gogh
In this release we’ve made a small change to the way that filtering works on the Asset Management page, and we think you’re going to love it! Now, if you apply filters, click on an asset to view it, and then click back to the Asset Management page, the filters you applied will persist!
NEW The Amazon Web Services connector now has support for synchronizing Amazon ECR repositories and ingesting vulnerabilities.
NEW Asset filters now persist in the asset management page after navigating to individual assets and back.
BUG FIX In a limited set of situations imports from Qualys via group would not run.
BUG FIX In certain situations invalid asset criticalities could be set.
BUG FIX In Safari some text would not fit the screen.
BUG FIX Some asset group names would impact asset category filters on the Asset Management page.
BUG FIX In certain situations scans could not be imported by query from Tenable.sc.
BUG FIX Adding assets to custom findings now works as expected.
BUG FIX Importing images from Prisma Cloud without registries set now omits the registry.
BUG FIX Dates for vulnerabilities now always reflect the local timezone.
BUG FIX In certain circumstances importing scans from AWS services resulted in an unhandled error.
Let’s paint a picture. It’s a bright sunny day in Florida and the Nucleus Ninja is out walking his Ninja-dog, Ninken, next to the local alligator pond. This gives the Nucleus Ninja a chance to clear his head and think about our customers, and that’s when it hits him, the dawning realization that may just be the answer to “how to do automation in the context of vulnerability management”.
This release is all about how to start doing automation of vulnerability management at scale. We announced our push towards better automation in our first Quarterly Customer Webinar, and this release is all around going down that path. We’ve designed a brand new workflow, introduced a templating language, and updated a lot of areas to make Nucleus just that much better around automating workflows. And without further ado, here’s what we’ve been up to the past month…
Last release we announced Vulnerability Processing Rules, a new feature in the Nucleus Automation Engine that makes it easy and efficient to set due dates on vulnerabilities in line with the security policies in your organization. At the time, we promised more functionality coming soon to help you automate as much of the vulnerability analysis and tracking process as possible.
Today we’re excited to announce an extension to these rules using our new “ninja-approved” Action Card view. Now, in vulnerability processing rules you can not only trigger actions on more flexible data criteria, but you can also choose between a wider set of actions to perform on vulnerabilities when they are ingested into Nucleus. All actions can be filtered to only apply to subsets of assets for maximum flexibility and scalability in your automation ruleset.
Some of the new actions available include:
What’s even more exciting is that you can do all of the above actions in one rule. Simply create a new rule, choose the vulnerability and asset criteria, and add action cards to your heart’s content! We think that this can be particularly useful for actions that are specific to your organization, such as changing the vulnerability’s severity, assigning it to a user and adding an explainer comment all in one go, completely automatically as new vulnerability data comes into Nucleus.
Here’s what it looks like to add a bunch of actions:
We’ve also managed to sneak into this release another new action in Asset Processing Rules: Asset Owner. This adds to the existing set of actions available when setting asset processing rules as new asset data is ingested into the Nucleus Asset Inventory, which still includes setting asset groups and risk attributes. Stay tuned as we convert this workflow to the new action card view in the coming months to make it just that much easier to do the second hardest part of Vulnerability Management, which is knowing what assets you have.
This was the “Eureka” moment that allowed the Nucleus Ninja to come to the simple following conclusion
“If I could just use asset fields dynamically in automation rules, I could have one rule to… rule them all!”
Well, it’s time to get back to your computer and fire up your browser, because Nucleus has got you covered! Introducing Dynamic Fields, a templating language for the Nucleus Automation Engine (and soon to be app-wide, or world-wide, depending on how you look at it).
Dynamic Fields allow you to construct asset and/or vulnerability processing rules that dynamically include information from the assets that the automation rules match during execution of the rule. For example, let’s say you want to automatically assign a vulnerability to a user based on the asset owner. That is now totally possible! Say goodbye to multiple rules for every possible value of a custom metadata field. You can create ONE rule to undertake multiple actions and dynamic values based on other attributes from elsewhere in Nucleus.
In this release, you can use asset fields dynamically in vulnerability processing rules when commenting on a vulnerability or assigning a vulnerability to a user, and in asset processing rules when adding an asset to an asset group or setting the asset owner.
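Conceptually, a dynamic field is a placeholder that gets resolved against the matched asset at rule-execution time. The placeholder syntax and field names below are purely hypothetical, to illustrate the idea rather than the real Nucleus templating language:

```python
import re

def render(template: str, asset: dict) -> str:
    """Replace {{field}} placeholders with values from the matched asset.
    (Hypothetical syntax -- see the Nucleus docs for the real one.)"""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(asset.get(m.group(1), "")),
                  template)

# One rule, many outcomes: the comment varies with each matched asset.
comment = render(
    "Auto-assigned to {{asset_owner}} for asset {{asset_name}}.",
    {"asset_owner": "alice@example.com", "asset_name": "web-01"},
)
```

Because the values come from the asset rather than the rule, a single rule covers every possible owner instead of one rule per owner.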
Here is a complete list of the asset fields that you can use dynamically in these automation rules:
We didn’t stop there! Since you can use custom metadata from your extended asset record in Nucleus, we’re continuing to update existing connectors to ingest additional metadata so you can use that data for better automation and reporting. In this release we’ve made updates to the Sonatype NexusIQ and Prisma Cloud connectors.
As always, you can use all of the above metadata in the Nucleus Automation Engine to make more and more powerful automations in your Nucleus Projects.
At Nucleus, one of our guiding principles is to listen to our customers and build functionality that makes the vulnerability management lifecycle as quick and painless as possible so that more time can be spent on high-value tasks.
True to this principle, we’ve previously released two powerful connectors that integrate with Amazon Web Services (AWS), providing customers with vulnerability data identified by AWS Inspector, as well as synchronization with AWS EC2 instances so that you can keep on top of what your attack surface actually looks like. Since then we’ve been listening to your feedback on how you use these connectors so that we can make them even better than they already are.
Today we’re excited to announce a brand new connector called the Amazon Web Services connector which integrates our two previous connectors into a single authentication flow and will serve as the foundation for scaling out support for more AWS services in the future cough ECR is next cough cough. This connector is the latest step in our push towards making the ingestion of cloud asset and vulnerability data a quick and painless task.
With this release, the AWS connector becomes a single place for you to set up and manage access to all of your AWS accounts in your Nucleus project by leveraging cross-account IAM roles. IAM roles can be created directly in the AWS Console, or deployed using the handy CloudFormation template we’ve provided.
Once roles are deployed to your AWS accounts, you can then add role ARNs directly in the connector setup page and Nucleus handles the rest! For more information on setting up the new AWS connector, as well as the CloudFormation template, see our help documentation here.
You can now manage a single Asset Inventory Sync rule for all of your AWS accounts across all regions in your Nucleus project. Simply go to the Asset Inventory Sync tab of the Automation page and click Add Rule. Select your AWS connector, the regions, and accounts that you want to synchronize instances from, and hit Save & Finish.
The synchronization rule now also ingests all available metadata as Additional Metadata which can be viewed under the Asset Details page, as well as leveraged to construct powerful rules in the Nucleus Automation Engine:
The AWS Inspector integration has been turbocharged with new functionality and flexibility. You can now import vulnerability results by Scan, Target, or Template, as well as select the regions that you want to query and ingest data from:
Once you’ve chosen the import method and regions, you’re then presented with all available results across each account that you’ve set up in the connector, and can further filter by region and other information:
Please note that with this release we’ve deprecated the existing EC2 and Inspector connectors, as well as authentication via IAM access keys. They will continue to be supported for existing customers during the transition period, but no new features will be released for the previous AWS EC2 or Inspector connectors.
We’ve been working hard behind the scenes to make the Nucleus Automation Engine even better and provide more flexibility and scenarios to trigger the automation workflows! In this release we’ve updated the Vulnerability Processing, Ticketing & Issue Tracking, and Notifications rules so that they can be triggered based on more vulnerability conditions, such as:
For more information on the complete list of new triggers, check out our help center.
As organizations mature their vulnerability management programs, it has become increasingly common (and necessary) to set and track Service Level Agreements (SLAs) for how and when a vulnerability is treated. This can be particularly helpful in large organizations, where security policies define expected remediation effort and timelines for different classes of vulnerabilities in different situations.
In this release we’re introducing our first foray into SLAs, Vulnerability Due Dates. Vulnerability Due Dates allow you to set when remediation efforts on vulnerabilities must be completed by, and track and report on vulnerabilities that are approaching their due date or have exceeded that date.
Using the Nucleus Automation Engine, you can create vulnerability processing rules which, based on all of the available vulnerability and asset criteria, enable you to automatically set due dates for vulnerabilities. Rules can be configured to set the due date as a set number of days, weeks or months from the time of ingestion, or the vulnerability’s discovered date.
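The rule logic boils down to adding an offset to a base date (time of ingestion or the discovered date). A minimal sketch, with months approximated as 30 days for simplicity since the exact calendar arithmetic Nucleus uses isn't documented here:

```python
from datetime import date, timedelta

def compute_due_date(base: date, amount: int, unit: str) -> date:
    """Due date = base date (ingestion or discovered) + a configured offset.
    Months are approximated as 30 days in this sketch."""
    days_per_unit = {"days": 1, "weeks": 7, "months": 30}
    return base + timedelta(days=amount * days_per_unit[unit])

# e.g. a rule that makes critical vulns due 2 weeks after discovery
due = compute_due_date(date(2021, 6, 1), 2, "weeks")
```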
Once set, you can identify and measure vulnerabilities in the Active Vulnerabilities page by due date, including whether a due date is not set, when a vulnerability is due within days, weeks or months, and when vulnerabilities are overdue!
This release we’re excited to announce a new connector with Assetnote! Assetnote is an Attack Surface Management platform that identifies an organisation’s external-facing systems and continuously monitors those systems for exploitable vulnerabilities.
We’ve worked closely with the team at Assetnote to create a connector that integrates with the Assetnote Notification Pipeline so that when assets and vulnerabilities are discovered across your environment, they surface into your Nucleus project(s) in real time. When combined with the Nucleus Automation Engine, you can create powerful and intelligent rules to suit your specific use case.
Read about setting up the Assetnote connector here.
The Active Vulnerabilities page has received a face lift to ensure that you are able to identify and track the vulnerabilities that matter most. We’ve introduced an updated Quick Filter pane at the top of the page that shows you rolled up numbers of vulnerabilities based on different tracked metrics. Using these filters, in one click you can drill down to the vulnerabilities that are most important to you.
We've also added the ability to multi-select vulnerabilities on the Active Vulnerabilities list, as well as a Modify menu to bulk update attributes of vulnerabilities. Currently we only support setting due dates in bulk, but be on the lookout for other bulk actions, such as setting severity, status, and exploitability, in the future as well.
Additionally, we’ve also updated the Source column with vulnerability source tool icons to make it easier for you to quickly identify where a vulnerability came from.
We added a Certificate Summary view to the Assets menu to make it easy to view and report on certificates. This view includes all the usual filters for quick drilldowns. Plus, you can export to a downloadable report in one click.
Phew. It’s been a great start to the year here at Nucleus Security with another release jam packed full of new functionality. See below to find out more!
In this release we have turbo-charged our ticketing automation functionality so you can get more out of ticket workflow management.
Tickets that have been raised using one of our ticketing connectors are now responsive to changes in the vulnerability source tool. For example, when a new instance of a vulnerability that has been previously raised in an open ticket is found, that existing ticket will be automatically updated with new information. What’s more, you can optionally have a ticket close in the downstream system when it’s been identified as remediated in Nucleus!
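The sync behavior described above can be sketched as a small reconciliation step: append any newly found instances to the open ticket, and optionally close it once Nucleus marks the vulnerability remediated. Field names and statuses are illustrative, not the actual connector schema:

```python
def sync_ticket(ticket: dict, vuln: dict,
                close_on_remediation: bool = False) -> dict:
    """Mirror vulnerability state onto an existing downstream ticket.
    (Illustrative sketch; real connectors call the ticketing system's API.)"""
    updated = dict(ticket)
    # Append instances found since the ticket was raised.
    new = [i for i in vuln["instances"] if i not in updated["instances"]]
    updated["instances"] = updated["instances"] + new
    # Optionally close the ticket when Nucleus sees the vuln as remediated.
    if close_on_remediation and vuln["status"] == "remediated":
        updated["state"] = "closed"
    return updated

t = sync_ticket(
    {"instances": ["web-01"], "state": "open"},
    {"instances": ["web-01", "web-02"], "status": "active"},
)
```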
It’s now also easier to retrospectively run ticketing rules over existing data sets. This means that if you decide to turn on ticketing in a Nucleus project later down the line, you can raise tickets against existing vulnerabilities that match your ticketing rule at the tap of a button.
Finally, where supported within the ticketing system, Nucleus will automatically upload a CSV file containing all of the affected assets for easier data export and parsing by support teams. We hope that this change will make it simpler for technical teams to remediate vulnerabilities.
We’ve made a change to our notifications section which, we hope you’ll agree, makes a lot more sense: we’ve moved the automation rule configuration for chat connectors to the Notifications section within Automation, rather than Ticketing & Issue Tracking.
We’ve also released a brand new connector for Microsoft Teams. This one has been asked for by a lot of customers, so if you haven’t yet had a chance to check it out, do so today!
There are a few improvements to the Nucleus Custom File Schema, making it easier to get asset and finding data into Nucleus:
The API has also been updated to return a container image’s tag, repository URL, digest and distro when querying for assets.
This release we’ve introduced improved support for Multi-Factor Authentication (MFA) by enabling the use of TOTP tokens for users. Users can now configure a TOTP token by navigating to their User Profile, selecting the 2-Factor Auth tab and following the steps to set it up with their app of choice (e.g. Google Authenticator).
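Authenticator apps like the ones mentioned above implement the standard TOTP algorithm (RFC 6238): an HMAC-SHA1 over a 30-second counter derived from the current time, truncated down to a short numeric code. A minimal stdlib-only sketch of how a code is derived from the shared secret:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, then
    dynamic truncation to the requested number of digits."""
    counter = struct.pack(">Q", unix_time // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: time 59s with this ASCII secret -> "94287082"
print(totp(b"12345678901234567890", 59, digits=8))
```

Both the server and the authenticator app compute this independently from the shared secret, which is why the codes match without any network round trip.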
The Tenable.io connector now supports ingesting by asset tag and network in addition to the existing ingestion by scan functionality. This update makes the connector far more flexible, as you can now ingest large volumes of data across different scan types using a single tag.
The Tenable.sc connector has similarly been updated to also be a host-based connector. In addition to importing by asset, this connector can now leverage Queries to import vulnerabilities into a Nucleus project using custom logic that is defined in your instance of Tenable.sc.
Both connectors have also been updated to improve the speed of vulnerability ingestion, and to ingest any additional asset information as Additional Metadata, which can be used as asset criteria in Automation rules.
Note: Tenable has deprecated the APIs which are used for ingesting by scan in both Tenable.io and Tenable.sc. Nucleus will continue to support ingesting by scan until these scan APIs have been removed. Consequently, we highly encourage customers to migrate existing vulnerability ingestion automation rules to leverage one of the new ingestion methods.
The SonarQube and SonarCloud connectors have both been updated to allow for more configurability on import. Now when setting up the connector, you can choose which types of findings (vulnerabilities, security hotspots, bugs and/or code smells) to import into Nucleus.
We’ve also updated the connectors to ingest far more data into Nucleus:
We’ve made some minor updates to the Qualys WAS connector to improve the speed of import. The connector now also ingests CVSSv3 scores for each vulnerability where available.