
OpenRTB support: Bid cache #216

Closed
dbemiller opened this issue Nov 22, 2017 · 7 comments · Fixed by #276

@dbemiller
Contributor

dbemiller commented Nov 22, 2017

The /openrtb2/auction endpoint needs to support sending bids to the cache.

Right now, I don't think we have a good enough sense of the cache's use-cases to build a quality feature here. This is the fundamental tradeoff:

  • Bids cached unnecessarily waste computing resources of the PBS host.
  • Bids that are not cached limit the publisher, to varying extents depending on the context.

If the cache behavior needs to be "chosen", because different behaviors have equal merit, then we have two options:

  • If the choice lies with the publisher, then we can add to the OpenRTB request.ext
  • If the choice lies with the host, we can add to the PBS Config

However... I also want to note the following, which we should take very seriously:

  • Each "choice" adds code complexity, which increases the time of all future development.
  • Choices are easy to add, but very hard to remove. Users will grow to depend on them.

To grow this project smoothly, our goal should be to identify the fewest "choices" necessary to encourage adoption on both sides.


These are things to consider:

  • The Prebid.js philosophy is: "the publisher ultimately decides which bid wins." If PBS caches the bids, Prebid auctions will have better overall performance. However, it is possible for Prebid.js to expose an API which caches only the bid the publisher chooses.

  • The prebid-mobile philosophy is: "App code updates are expensive. It's better if the Server does everything, including picking the winning bid." As far as I can tell, there's no room for negotiation here.

  • @DucChau mentioned that Rubicon wanted an option to cache just the VAST XML. Currently PBS caches a JSON blob with other info too.

  • There's some ongoing debate about the cache's usefulness for certain types of bids in "Do not cache bids which round down to $0.00 CPM" (#199).


These are some deficiencies in the /auction endpoint's cache behavior. Ideally, we can do better in /openrtb2/auction.

  • Bids aren't cached until the auction has completed. So if any bidder in the auction times out, the cache calls do too. This basically wastes the auction on mobile, since mobile requires the bid to be cached.
  • @DucChau noted that /auction doesn't let you cache just the VAST XML. It requires the JSON blob. If this is a valid use-case, then the /openrtb2/auction API should support this too.
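The first deficiency suggests firing one cache call per bid as each bidder responds, instead of waiting for the whole auction. A minimal sketch of that idea, assuming a hypothetical cacheBid call (the types and names are illustrative, not Prebid Server's actual code):

```go
package main

import (
	"fmt"
	"sync"
)

// Bid is a simplified stand-in for an OpenRTB bid; the fields are illustrative.
type Bid struct {
	Bidder string
	AdM    string
}

// cacheBid is a hypothetical cache call; the real endpoint and payload are
// not specified in this issue.
func cacheBid(b Bid) string {
	return fmt.Sprintf("uuid-for-%s", b.Bidder)
}

// cacheAsTheyArrive fires one cache call per bid as soon as that bidder's
// response arrives on the channel. A slow or timed-out bidder then no
// longer delays caching of everyone else's bids.
func cacheAsTheyArrive(bids <-chan Bid) map[string]string {
	var mu sync.Mutex
	var wg sync.WaitGroup
	uuids := make(map[string]string)
	for b := range bids {
		wg.Add(1)
		go func(b Bid) {
			defer wg.Done()
			id := cacheBid(b)
			mu.Lock()
			uuids[b.Bidder] = id
			mu.Unlock()
		}(b)
	}
	wg.Wait()
	return uuids
}

func main() {
	bids := make(chan Bid, 2)
	bids <- Bid{Bidder: "bidderA", AdM: "<VAST/>"}
	bids <- Bid{Bidder: "bidderB", AdM: "<div/>"}
	close(bids)
	fmt.Println(cacheAsTheyArrive(bids))
}
```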

@bretg and @mjacobsonny might want to discuss this. At the moment, I don't think it's fleshed out well enough for much useful engineering work to be done.

@dbemiller dbemiller changed the title OpenRTB support: Bid cacheing OpenRTB support: Bid cache Nov 22, 2017
@dbemiller
Contributor Author

Got an interesting suggestion today from someone trying to help simplify the "what to cache?" options.

It was: always cache the entire OpenRTB bid. Expose GET endpoints on the cache so that people can query a subset of the info.

So for example:

GET prebid-cache-url.com/cache?uuid=abc&keys=adm: fetch just the ad markup
GET prebid-cache-url.com/cache?uuid=abc: fetch the entire JSON bid

This would basically reduce the complexity of prebid-server, at the expense of prebid-cache.

Not saying it's a great idea... but it doesn't seem like a bad one, and it's creative enough I thought I'd mention it.

@mkendall07
Member

Interesting concept, but doesn't that assume JSON is the structure? Also, we are storing a lot of stuff we probably won't ever need.

@dbemiller
Contributor Author

yeah... we talked about storing stuff we don't need, but the rest of the Bid is tiny compared to the creative markup.

I am concerned about the JSON structure requirement, though. In order to return a Content-Type: application/xml page, either PBC would need to know about the adm field specifically, or the GET API would have to let clients tell PBC which format they expected for the response.

Another concern came to mind after thinking about it some more:

  • In OpenRTB, adm isn't a required field. The bid's nurl can also return VAST. So that API doesn't make a great workflow for a client who just wants the creative content--no questions asked.
  • Not all adm is XML. If a bid is cached for an AdUnit which could be banner or video, then the caller might not even know for sure whether the adm field is storing XML or html/js. This commits PBC to knowing an awful lot about OpenRTB and the data formats.

Overall... I'm not too crazy about this idea. It seems to make things a tiny bit simpler for PBS, but much more complex for PBC.

@dbemiller
Contributor Author

I think I've got something reasonable which hits all the use-cases. Initially, we can add an option like this inside bidrequest.ext.prebid:

{
  "cache": {
    "bids": {
      "winners": true,
      "deals": true
    }
  }
}

bids is required, but winners and deals are optional. By default, we will only cache "winners".
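A sketch of how that minimal ext could be parsed and defaulted, assuming Go structs along these lines (the names are illustrative, not the merged implementation):

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// ExtCacheBids mirrors the proposed request.ext.prebid.cache.bids object.
// Pointers distinguish "omitted" from "explicitly false".
type ExtCacheBids struct {
	Winners *bool `json:"winners,omitempty"`
	Deals   *bool `json:"deals,omitempty"`
}

type ExtCache struct {
	Bids *ExtCacheBids `json:"bids"`
}

// parseCache applies the proposal's rules: bids is required, and an omitted
// winners defaults to true ("by default, we will only cache winners").
func parseCache(raw []byte) (winners, deals bool, err error) {
	var c ExtCache
	if err = json.Unmarshal(raw, &c); err != nil {
		return
	}
	if c.Bids == nil {
		err = errors.New(`request.ext.prebid.cache: "bids" is required`)
		return
	}
	winners = c.Bids.Winners == nil || *c.Bids.Winners
	deals = c.Bids.Deals != nil && *c.Bids.Deals
	return
}

func main() {
	w, d, _ := parseCache([]byte(`{"bids":{"deals":true}}`))
	fmt.Println(w, d) // winners defaults to true when omitted
}
```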


Depending on which use-cases make sense in the future, we can expand the API to something like this without introducing any breaking changes:

{
  "cache": {
    "vastxml": {
      "winners": true,
      "deals": true,
      "all": true,
      "cpmfloor": 0.01
    },
    "bids": {
      "winners": true,
      "deals": true,
      "all": true,
      "cpmfloor": 0.01
    }
  }
}

When we introduce vastxml, bids will become optional, and only one of bids or vastxml will be required. vastxml would describe which bids will have just their adm stored as XML strings.
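The "one of bids or vastxml" rule in the expanded shape could be validated along these lines (a sketch; the struct and function names are illustrative):

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// CacheScope holds the per-target options from the expanded proposal above.
type CacheScope struct {
	Winners  bool     `json:"winners,omitempty"`
	Deals    bool     `json:"deals,omitempty"`
	All      bool     `json:"all,omitempty"`
	CpmFloor *float64 `json:"cpmfloor,omitempty"`
}

type ExtCacheV2 struct {
	Bids    *CacheScope `json:"bids,omitempty"`
	VastXML *CacheScope `json:"vastxml,omitempty"`
}

// validate enforces the forward-compatible rule: at least one of bids or
// vastxml must be present once vastxml exists.
func validate(raw []byte) (*ExtCacheV2, error) {
	var c ExtCacheV2
	if err := json.Unmarshal(raw, &c); err != nil {
		return nil, err
	}
	if c.Bids == nil && c.VastXML == nil {
		return nil, errors.New(`cache: one of "bids" or "vastxml" is required`)
	}
	return &c, nil
}

func main() {
	c, err := validate([]byte(`{"vastxml":{"winners":true}}`))
	fmt.Println(c.VastXML.Winners, err)
}
```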

@dbemiller
Contributor Author

Per the discussion in #199, cpmfloor is not needed. Bids which round to $0.00 CPM should not be cached, even if they're "winners" or "deals".
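The rounding rule from #199 is simple to state in code. A sketch, assuming CPMs render half-up to two decimal places (the function name is illustrative):

```go
package main

import "fmt"

// roundsToZero reports whether a CPM renders as "0.00" after rounding
// half-up to two decimals. Per #199, such bids should not be cached even
// if they are "winners" or "deals".
func roundsToZero(cpm float64) bool {
	return cpm < 0.005
}

func main() {
	fmt.Println(roundsToZero(0.004), roundsToZero(0.01))
}
```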

@dbemiller dbemiller added the ready label Jan 4, 2018
@dbemiller dbemiller self-assigned this Jan 4, 2018
@dbemiller
Contributor Author

dbemiller commented Jan 9, 2018

per some more discussion with @mjacobsonny, I learned some things. Today, Prebid publishers expect:

hb_uuid -- applies to the top overall bid for each Imp
hb_uuid_{bidder} -- applies to the top bid from {bidder} for each Imp

There is a problem with this today. There is no good support for a Bidder who makes two bids on the same Imp for separate Deals. It was proposed to add a new key to support this:

hb_uuid_{dealId} -- applies to every bid with a DealID

However, he also described a business use-case for hb_uuid_{bidder} even if there aren't Deal IDs.

So... it's my opinion that improving Deal support should be a future improvement. Since we're not deprecating hb_uuid or hb_uuid_{bidder}, and the Deal support proposal wouldn't cause breaking changes to the API, there's absolutely no reason to worry about it now.

All that in mind... this is the behavior I implemented:

{
  "cache": {
    "bids": {
      // This is an object, but no options exist yet. Some may be added in the future
    }
  }
}

This will apply hb_uuid and hb_uuid_{bidder} to the bids as described above.
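The implemented targeting behavior can be sketched as follows (targetingKeys and its inputs are illustrative; note that a later comment in this thread renames hb_uuid to hb_cache_id):

```go
package main

import "fmt"

// targetingKeys builds the keys described above: hb_uuid for the overall
// top bid of an Imp, plus hb_uuid_{bidder} for each bidder's top bid.
func targetingKeys(overallUUID string, bidderUUIDs map[string]string) map[string]string {
	keys := map[string]string{"hb_uuid": overallUUID}
	for bidder, uuid := range bidderUUIDs {
		keys["hb_uuid_"+bidder] = uuid
	}
	return keys
}

func main() {
	k := targetingKeys("abc", map[string]string{"rubicon": "def"})
	fmt.Println(k["hb_uuid"], k["hb_uuid_rubicon"])
}
```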

@ghost ghost added in progress and removed ready labels Jan 10, 2018
@ghost ghost removed the in progress label Jan 17, 2018
@dbemiller
Contributor Author

One last-minute change to this before merge... we replaced hb_uuid with hb_cache_id.

Besides just being a better name, this will save existing Prebid Mobile publishers from having to update their DFP creatives when they switch to OpenRTB.

allar15 pushed a commit to allar15/prebid-server that referenced this issue Nov 24, 2023
StarWindMoonCloud pushed a commit to ParticleMedia/prebid-server that referenced this issue Jun 5, 2024