We’re delighted to announce Screaming Frog SEO Spider version 19.0, codenamed internally as ‘Peel’. This update contains a number of significant new features and enhancements based upon user feedback and a little internal steer.

While subtle, the GUI has been refreshed in look and feel, with crawl behaviour functions (crawl a subdomain, a subfolder, all subdomains etc.) moved to the main nav for ease. These options had previously been within the configuration, so this also makes them accessible to free users. There are now alternate row colours in the main tables, updated icons, and even the Twitter icon and link have been removed (!).

While the UX, tabs and filters are much the same, the configuration has received an overhaul. It has been unified into a single dialog, with links to each section, which makes adjusting the config more efficient than opening and closing each section separately. The naming and location of config items should be familiar to existing users, while being easier to navigate for new users. There have been a few small adjustments, such as saving and loading configuration profiles now appearing under ‘Configuration’ rather than the ‘File’ menu. System settings such as user interface, language, storage mode and more are available under ‘File > Settings’, in their own unified configuration. You can also cancel any changes made by using the ‘Cancel’ button on the configuration dialog.

You can now segment a crawl to better identify and monitor issues and opportunities from different templates, page types, or areas of priority. The segmentation config can be accessed via the config menu or the right-hand ‘Segments’ tab, and it allows you to segment based upon any data found in the crawl, including data from APIs such as GA or GSC, or post-crawl analysis. You can set up a segment at the start, during, or at the end of a crawl.

There’s a ‘Segments’ column with coloured labels in each tab. When segments are set up, the right-hand ‘Issues’ tab includes a segments bar, so you can quickly see where on the site the issues are at a glance. You can then use the right-hand segments filter to drill down to individual segments. There’s also a new right-hand ‘Segments’ tab with an aggregated view, to quickly see where issues are by segment, and you can use its ‘view’ filter to better analyse items like crawl depth by segment, or which segments have different types of issues. Once set up, segments can be saved with the configuration.

Segments are fully integrated into various other features in the SEO Spider as well. In crawl visualisations, you can now choose to colour by segment. You can also choose to create XML Sitemaps by segment, and the SEO Spider will automatically create a Sitemap Index file referencing each segmented sitemap. Within the Export for Looker Studio for automated crawl reports, a separate sheet will also be automatically created for each segment, which means you can monitor issues by segment in a Looker Studio Crawl Report as well.

Custom Extraction is a super powerful feature in the SEO Spider, but it’s also quite an advanced one, and many users couldn’t care less about learning XPath or CSSPath (understandably so). To help with this, you’re now able to open a web page in our inbuilt browser and select the elements you wish to extract from either the web page, the raw HTML or the rendered HTML. We’ll then formulate the correct XPath/CSSPath for you, and provide a range of other options as well.

Just click the web page icon to the side of an extractor to bring up the browser. Input the URL you wish to scrape, and then select the element on the page. The SEO Spider will then highlight the area you wish to extract and create an expression for you, with a preview of what will be extracted based upon the raw or rendered HTML. You can switch to Rendered or Source HTML view and pick a line of HTML as well. For example, if you wish to extract the ‘content’ of an OG tag, you can select the attribute you wish to extract from the dropdown, and it will formulate the expression for you. In this case, it will scrape the published time, which is shown in the source and rendered HTML previews after selecting the ‘content’ attribute.

For those of you that have mastered XPath, CSSPath and regex, you can continue to input your expressions in the same way as before. At the moment this new feature doesn’t help with extracting JS, but we plan on extending this functionality to help scrape conceivably anything from the HTML.
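To illustrate what an expression like the ones the SEO Spider generates actually selects, here is a minimal sketch in Python using the standard library’s ElementTree (which supports a limited XPath subset). The HTML, tag names and values below are illustrative assumptions, not output from the SEO Spider itself.

```python
# Sketch: selecting the 'content' attribute of meta/OG tags, equivalent
# to an XPath like //meta[@property='og:title']/@content.
# The markup below is a hypothetical example page.
import xml.etree.ElementTree as ET

page = ET.fromstring(
    "<html><head>"
    "<meta property='og:title' content='Screaming Frog SEO Spider 19.0'/>"
    "<meta property='article:published_time' content='2023-07-25T09:00:00+00:00'/>"
    "</head><body/></html>"
)

# Select the element by its 'property' attribute, then read 'content'.
og_title = page.find(".//meta[@property='og:title']").get("content")
print(og_title)  # Screaming Frog SEO Spider 19.0

# The same pattern scrapes a published time from its meta tag.
published = page.find(".//meta[@property='article:published_time']").get("content")
print(published)  # 2023-07-25T09:00:00+00:00
```

The same selection could equally be written as a CSSPath (e.g. `meta[property="og:title"]`) plus an attribute name, which is how the visual picker lets you choose the attribute from a dropdown.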