Exercise 1: Filtering the documents in the aggregation pipeline

In this exercise, you’ll launch IntelliShell and create an aggregate statement that includes only one stage. The stage is based on the $match aggregate operator, which lets you filter the documents in the pipeline so you’re working with only a subset of documents in later stages.

In the exercises that follow this one, you’ll add two more stages to the aggregation pipeline, and you’ll add two processing options to the statement, after the pipeline. By the end of the exercises, you’ll have created a statement that returns the number of transactions in each state, as they are stored in the customers collection.

To complete the exercises in this section, you must first download the customers.json file and save it to a local drive that you can access from within Studio 3T. You’ll be using the file to create the customers collection, which you’ll need throughout this course. You can download the customers.json file here. If you’ve already imported the customers collection, you can skip to Step 12.
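
If you prefer the command line, the same data can typically be loaded with MongoDB’s mongoimport tool instead of the Import Wizard. The sketch below assumes the sales database and customers collection used in this course; the connection string is a placeholder you would replace with your own Atlas URI.

mongoimport --uri "mongodb+srv://<user>:<password>@<your-cluster>/sales" \
  --collection customers --file customers.json

If customers.json stores its documents as a single JSON array rather than one document per line, add the --jsonArray option. The exercises that follow assume the customers collection exists in the sales database, however you choose to import it.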

To filter the documents in the aggregation pipeline

  1. Launch Studio 3T and connect to MongoDB Atlas.
  2. In the Connection Tree, right-click the connection (top-level node) and click Add Database.
  3. In the Add Database dialog box, type sales in the Database Name text box, and then click OK. Studio 3T adds the sales database node to the Connection Tree.
  4. Right-click the sales database node and click Import Collections.
  5. In the Import dialog box, select the JSON option if not selected, and then click Configure. Studio 3T adds the JSON Import tab to the main window.

    On the JSON Import tab, you can add files that contain source data to be imported into the database. The files appear as a list in the tab’s main window, where you can select the ones you want to import.
  6. Click the Add Source Files button (the plus sign just above the main window).
  7. In the file manager dialog box, navigate to the folder where you saved the customers.json file.
  8. Select the file and click Open. Studio 3T adds the file to the import list.
  9. Select the customers.json file from the list, and then click the Execute button on the tab’s toolbar. When the Confirm import dialog box appears, click OK. Studio 3T adds the customers collection to the sales database.
  10. Close the JSON Import tab. If prompted to save your changes, click Discard changes.
  11. In the Connection Tree, expand the sales database node and, if necessary, expand the Collections node. The Connection Tree should now include the customers collection node.
  12. Right-click the customers collection node, and then click Open IntelliShell. Studio 3T adds the IntelliShell tab to the main window. By default, Studio 3T defines a basic find statement on the customers collection object, but you’ll be replacing this statement with an aggregate statement.
  13. Replace the find statement with the following aggregate statement:
db.customers.aggregate(
  [
    { "$match": { "dob": { "$lt": ISODate("1970-01-01T00:00:00.000Z") } } }
  ]
);

The statement calls the aggregate method on the customers collection. In this case, the aggregate method takes one argument—the pipeline. The pipeline is enclosed in square brackets and includes only one stage, which is enclosed in curly braces. The stage uses the $match operator to return those documents with a dob value before 1970. The top pane of the IntelliShell tab should now look similar to the following figure.

The operator’s expression specifies the dob field, followed by a subexpression. The subexpression includes the $lt (less than) comparison operator and a datetime value for January 1, 1970. The ISODate constructor converts the string to a datetime object, making it possible to carry out the comparison.
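
As an aside, the filter document passed to $match uses the same query syntax as a filter passed to find. The following sketch (not part of the exercise) returns the same documents as the single-stage pipeline above, which can be a handy way to sanity-check a $match expression:

db.customers.find(
  { "dob": { "$lt": ISODate("1970-01-01T00:00:00.000Z") } }
);

The difference is that $match keeps the matching documents in the aggregation pipeline, where later stages can continue to process them, whereas find simply returns them.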

  14. On the IntelliShell toolbar, click the Execute button (the green arrow with the screen tip that reads Execute entire script). Studio 3T runs the aggregate statement and displays the results in the tab’s lower pane, as shown in the following figure. The results are displayed in Table View.

The first stage should return 406 documents. (If you can’t see the Count Documents button, check that the Enable Query Assist button is turned on in the IntelliShell toolbar.) If the pipeline contained a second stage, MongoDB would use these documents when processing that stage.
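
If you want to double-check that count from the shell itself, one option (an aside, not part of the statement you’ll build on in later exercises) is to run a copy of the pipeline with a $count stage appended; the field name matchingDocuments below is arbitrary:

db.customers.aggregate(
  [
    { "$match": { "dob": { "$lt": ISODate("1970-01-01T00:00:00.000Z") } } },
    { "$count": "matchingDocuments" }
  ]
);

This should return a single document along the lines of { "matchingDocuments" : 406 }. Revert to the original single-stage statement before moving on, so it matches the one used in the next exercise.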

  15. Leave the IntelliShell tab open and the existing statement in place for the next exercise. You’ll be building on this statement by adding the next pipeline stage.
