# Test

https://www.youtube.com/watch?v=tYva3PJU3-I

[00:00] ...is, you know, coming in more than, you know, a couple days afterwards, because things shouldn't be updated very late.
[00:10] But we're not building any artificial logic that says we're only going to update, you know, transactions within the 7-day range.
[00:15] The API will drive what gets updated.
[00:19] >> Okay. Got it.
[00:33] >> Okay. So, I mean, say if we are starting with the five pilot shacks, uh, just as a ballpark number, are we expecting the API to pull data for the remaining 350-plus shacks?
[00:54] >> Well, no, because it wouldn't be needed... I don't really understand that question.
[01:05] >> Uh, no. So, this is... uh, okay. Sorry.
[01:13] Um, yeah. So, Justin, this line item is for the NCR to Snowflake, for that, uh, 15-minute, uh, real-time data.
[01:33] >> Okay. So, yeah, I think we would expect to only pull the data that's available again.
[01:37] And I'm pretty sure the NCR API works the same way, right?
[01:41] It's based off of delta changes.
[01:45] So, whatever checks are changed, that's what should be flowing through in those API payloads, and that's what we should be pushing into the downstream, you know, tables.
[01:56] So, theoretically, the way it works is: as we start to go into pilot, let's say that it is five shacks, right? Like, five shacks would have data coming from the Q API; the other 360 shacks would still be coming from NCR, because they would still be on NCR, and we would merge the data together into, like, the curated layer, you know, as part of the curation.
[02:19] So, I think both API approaches should work largely the same way. The details will be slightly different in terms of the key values that we use to identify, you know, what is an update versus what is a new insert, that kind of thing, but overall, whatever is coming in through the API payload should be what we reflect in downstream data.
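To make the update-versus-insert idea concrete, here is a minimal sketch of what that delta-driven merge could look like, assuming Snowflake is reached from Python; every table, column, and key name is a hypothetical placeholder rather than the actual warehouse schema.

```python
# Sketch only: delta-driven upsert into a curated check table.
# RAW_POS_CHECKS, CURATED_CHECKS, and all column names are invented
# placeholders; `conn` is assumed to be an open snowflake.connector
# connection.
MERGE_SQL = """
MERGE INTO CURATED_CHECKS AS tgt
USING (
    SELECT CHECK_ID, SHACK_ID, BUSINESS_DATE, TOTAL_AMOUNT, UPDATED_AT
    FROM RAW_POS_CHECKS
    WHERE LOADED_AT >= DATEADD('minute', -15, CURRENT_TIMESTAMP())
) AS src
ON tgt.CHECK_ID = src.CHECK_ID          -- key value identifying the check
WHEN MATCHED THEN UPDATE SET            -- the check changed: an update
    TOTAL_AMOUNT = src.TOTAL_AMOUNT,
    UPDATED_AT   = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT            -- first time we see it: a new insert
    (CHECK_ID, SHACK_ID, BUSINESS_DATE, TOTAL_AMOUNT, UPDATED_AT)
    VALUES (src.CHECK_ID, src.SHACK_ID, src.BUSINESS_DATE,
            src.TOTAL_AMOUNT, src.UPDATED_AT)
"""

def apply_deltas(conn) -> None:
    """Reflect whatever came through the API payload; no artificial 7-day cutoff."""
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
```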
[02:41] >> Okay. And, uh, Justin, has anything been shared from the NCR side on this API, or any POC?
[02:50] >> Yeah, so Altosh has been working with the team on this. Have you not been looped into that? I can certainly loop you in, but since Altosh is out this week, I don't know how much, you know, progress we'll make. But I think definitely, next week, when he's back, you should regroup with him.
[03:07] Uh, because we've gotten sample payloads from the NCR API. Like, we have one shack, one production shack, set up
[03:16] >> Okay.
[03:17] >> that will feed data through that API. Um, so I think he ultimately received that from the integration team earlier this week or end of last week, so I think now it's just working through that. But certainly you should be looped into that. So, um, I'll find the email I was copied on, that last email to Altosh, and I'll forward it along to you, Sandeep.
[03:44] But I would suggest you reach out to Altosh Monday, when he's back, to, you know, have a follow-up conversation.
[03:50] >> Okay. So, we would have this, uh, API data for these three, uh, transaction...
[03:57] >> No, that's not how it works. Again, like I said, it's not going to match the replication database.
[04:06] It's going to look more like what we're getting from Q today, right?
[04:12] It's going to be a check-level JSON that includes everything in that JSON.
[04:16] >> Okay. So, you mean even the item, tender... everything would be included in the check, similar to how Q works?
[04:23] >> Yeah, I'm pretty sure. And like I said, I'll forward you the info that we have; I don't know if he had a payload attached in there or where the payload was shared. But, um, we can also get you the documentation, um, from, you know, NCR. But ultimately,
[04:39] yeah, it's a very similar approach to the way the Q setup works.
[04:44] It's, you know... the API is based off of anything that's changed in the last 15 minutes, and basically every check will come through as sort of a separate item, with all the details around the check, and the items tied to that check, and the payment tied to that check, etc.
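For illustration only, here is a rough guess at what such a check-level payload might look like, and how it could be flattened into check and item rows; the field names are invented, since the actual NCR and Q payload schemas aren't spelled out in this discussion.

```python
# Invented example of a check-level JSON; the real NCR/Q fields will differ.
sample_check = {
    "checkId": "chk-1001",
    "shackId": "shack-042",
    "closedAt": "2024-05-01T12:31:00Z",
    "items": [
        {"itemId": "itm-55", "qty": 2, "price": 5.99},
        {"itemId": "itm-73", "qty": 1, "price": 3.49},
    ],
    "payments": [{"tender": "CARD", "amount": 15.47}],
}

def flatten_check(check: dict) -> tuple[dict, list[dict]]:
    """Split one check-level JSON into a check row plus its item rows."""
    check_row = {
        "check_id": check["checkId"],
        "shack_id": check["shackId"],
        "closed_at": check["closedAt"],
    }
    item_rows = [{"check_id": check["checkId"], **item} for item in check["items"]]
    return check_row, item_rows
```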
[04:59] >> Okay.
[05:03] Yeah. I mean, if you have the documentation, the API documentation, if you can share it, then I can go through it.
[05:08] Um, yeah, I'll have to see if I have it.
[05:10] Um, I probably won't have time to look at it today, but I'll see if I can find it, and I'll forward you the email that I was copied on, at least as a starting point.
[05:22] >> Sure. Um, so, Justin, what is the timeline for this requirement?
[05:28] >> I mean, um, I can't tell you that.
[05:30] I don't have a timeline, because that needs to be something you coordinate with Blaine and the larger POS project team, because the timeline for us to be able to build it is one thing;
[05:41] there are still, like, contract things that need to happen.
[05:47] >> So, I guess this is TBD.
[05:47] >> Um, and part of it is driven by when we're going to have resources available to focus on the POS, you know, work.
[05:58] Um, because we don't have any dev resources right now allocated to work on POS.
[06:00] Basically, you're sort of the one, you know, resource we have on the Mindtree side working on this.
[06:11] So, I'm going to, um, talk with Blaine and, um, you know, follow up on next steps to try to get, um, sort of a more robust Mindtree team in place to drive this forward. But I can't tell you what that looks like. I think for now we're just doing discovery to understand what we think the effort might be.
[06:30] >> Okay. But is the expectation that, when the pilot phase is ready for the five shacks on Q, um, I mean, should we also be ready with the NCR real time?
[06:47] >> So, I think, again, that's a question for the program team to answer. But my understanding is that, yes, for us to go into pilot, we need to have a solution where either we are providing near-real-time reporting across all shacks, both Q and NCR, through Tableau, or we have a feed set up to send data from NCR, a near-real-time, you know, data feed that we're loading to Q in the right way for them to show that in the notify app.
[07:23] That is an ongoing discussion, to understand if that's even feasible and what would be required to make that happen.
[07:29] So, again, that's a question for the program team, because there's no decision made on even how we're going to do it yet.
[07:36] >> Okay. Um, do we have any Tableau dashboards in scope for this live reporting, or is it only the Q notify app?
[07:47] >> So, as I just said, that decision has not been made yet.
[07:54] >> Okay. So, this is something I can get from, uh, Adamus.
[07:59] >> I mean, you can definitely ask. I don't think they have an answer, but that's an answer they need to provide; I can't provide it.
[08:08] >> Okay.
[08:10] Uh, yeah, I think the first and second are almost the same: data for the Q notify app.
[08:23] Uh, again, this would be a design question, right? How this data gets sent over to Q from Snowflake.
[08:29] And if it does, it's still a question of whether they can even do this, or how. So, yes, it is a design question, and even a whether-it's-going-to-happen-or-not question.
[08:43] >> Okay. Yeah. So, uh, the third one is for the Q to Snowflake.
[08:56] Yeah, these are the list of APIs that will run every 15 minutes,
[08:58] uh, pull in the incremental data,
[09:01] and load the JSON files from S3 to Snowflake.
[09:07] Uh, we'll have the job scheduled at our end.
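A sketch of what that 15-minute job could look like, assuming the incremental pull lands JSON in S3 and a Snowflake external stage does the load; the endpoint, query parameter, bucket, stage, and table names are all assumptions for illustration.

```python
# Hypothetical 15-minute job: pull incremental JSON, land it in S3,
# then COPY it into Snowflake. All names below are invented; `conn` is
# assumed to be an open snowflake.connector connection.
import json

import boto3
import requests

def run_incremental_load(api_url: str, conn) -> None:
    # 1. Pull only what changed in the last 15 minutes (parameter name assumed).
    payload = requests.get(
        api_url, params={"modifiedSinceMinutes": 15}, timeout=60
    ).json()

    # 2. Land the raw JSON in S3 for replay and audit.
    boto3.client("s3").put_object(
        Bucket="pos-raw-landing",            # hypothetical bucket
        Key="q_checks/latest.json",
        Body=json.dumps(payload),
    )

    # 3. Load from the external stage into the raw table.
    with conn.cursor() as cur:
        cur.execute("""
            COPY INTO RAW_Q_CHECKS
            FROM @POS_S3_STAGE/q_checks/
            FILE_FORMAT = (TYPE = 'JSON')
        """)
```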
[09:12] Uh, now, for this, um, so, Justin, I wanted to check with you on, um, how we plan to... I mean, how do you visualize this?
[09:30] Uh, so, let me just... I created a...
[09:55] Just need a second.
[10:15] So, I was, um, I mean, I was proposing, like, two options when it comes to loading the Q data into Snowflake and then having it exposed to the reporting layer.
[10:32] Okay.
[10:32] >> Option one, no, because we're not going to bring together the two raw layers; they're not the same raw layer.
[10:39] Yeah, you can't create a view off of a raw layer to have, like, a combined raw layer.
[10:46] >> Okay. Uh, option two, it's like creating a separate flow for Q and then having them combined on the...
[10:59] >> So, no, that's not the right approach either. Let me show you the approach I have in mind.
[11:06] Okay, it's similar to what you just showed, but that's not exactly it. Um, so let me share here.
[11:16] Okay, so this is something I just started working on.
[11:19] Don't look too much at the formatting, etc.
[11:22] But, um, basically, the idea is we're going to have the Q POS API,
[11:25] and we're going to have the NCR API, right?
[11:27] Both of these are new integrations.
[11:30] Um, so we're going to bring them in as near-real-time integrations.
[11:34] And then we're going to combine the near-real-time data into a standardized check- and item-level data model that brings together the Q and Aloha data from the APIs.
[11:47] This data model would largely mimic what we have today in our, like, fact sales and fact sale items.
[11:52] There might be a couple other, you know, things that we include in there, um, as necessary.
[11:56] But the idea would be that it's basically, you know, a check- and item-level view.
[11:59] Maybe some payment details.
[12:03] I don't know how critical that would be for, um, you know, go-live versus longer term, but that's sort of the approach, right? So, we bring this together, and then this data becomes available to feed into existing batch reporting for Q, so we can basically take data from here and feed it into, you know, the tables that drive any of our existing analytics or reporting data.
[12:30] So, probably largely that guest sales view table, um, to keep existing batch reporting whole and include the Q data in there.
[12:40] So, theoretically, as an NCR store migrates to Q, the data stops coming through the replication server and comes through the Q API.
[12:52] Um, so that keeps us whole, and it also allows us to potentially do any near-real-time reporting we need to do, whether that is, you know, building a replacement for, like, the notify app, if Q can't support, you know, feeding the data in for that, or any other new use cases. And, as we start to look at building new data products, this also becomes the source of data for those.
[13:21] Because, at the end of the day, we want to build anything new off of the near-real-time data as much as possible, so that when we turn off NCR, just Q data is flowing, and we already have data sets that are whole for Q.
[13:38] So, this is, conceptually, how I think it needs to work.
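Conceptually, the standardized model being described might boil down to something like the following view, unioning the two near-real-time curated sources; the object and column names are placeholders, with the real model expected to mirror the existing fact sales and fact sale items tables.

```python
# Placeholder sketch of the combined check-level model. As a shack migrates
# from NCR to Q, its rows simply start arriving on the other side of the union.
STANDARDIZED_CHECKS_VIEW = """
CREATE OR REPLACE VIEW CURATED.STD_CHECKS AS
SELECT check_id, shack_id, business_date, total_amount,
       'Q' AS source_system
FROM   CURATED.Q_CHECKS          -- pilot shacks on the Q API
UNION ALL
SELECT check_id, shack_id, business_date, total_amount,
       'NCR' AS source_system
FROM   CURATED.NCR_CHECKS        -- remaining shacks still on NCR/Aloha
"""
```

Downstream consumers, such as the guest sales view, would read only from this combined model, which is what keeps existing batch reporting whole during the migration.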
[13:45] >> Okay.
[13:51] So, the standardized model here, is it a completely new model, or is it using the existing model?
[14:01] >> As I said a few minutes ago, it's a new model that is basically a replication of our existing fact sale items and fact sales, right?
[14:13] Like, it's a new version of the check- and item-level data, and probably, you know, strictly speaking, it will be very similar, although slightly different, because we're going to have to figure out how we merge Q and NCR data together, with the goal of, you know, keeping sort of similar functionality, but we have to make sure that it accommodates the Q data.
[14:38] >> Yeah. Right.
[14:41] >> Yeah, since, I mean, the Q reference data, say, for example, the order mode IDs, will be different as compared to... >> Yeah, well, we have to figure out how to bridge that anyway, to be able to fit it into batch reporting, right? So, yes, and that's where all the work is, right? Actually building the integrations and getting the data loaded,
[15:04] that doesn't actually involve any thought, right? All it is, is taking data and loading what you get. This is the hard part: building the, you know, combined version of the data, and then going and updating existing reporting to use the new Q data, and then building out any new, you know, reporting off of the near-real-time version.
[15:27] >> Okay.
[15:31] >> So, I'm going to flesh this out a little bit more and add some more details, but this is, you know, what I mentioned, um, end of last week,
[15:37] earlier this week, that I wanted to build out, so we have clarity around how, conceptually, the data flows need to work,
[15:45] and actually what, you know, is in scope to be built. But basically, it's the new integrations, which we've already started on
[15:52] and have, you know, a pretty good step forward on. This is the hard part,
[15:59] and this is where most of the work is going to go,
[16:00] and then, you know, figuring out how we keep existing reporting, and whatever we build new.
[16:04] The new is still TBD, because right now nothing is being asked for, other than potentially, you know, sending the NCR data to Q, or building a sales dashboard to replace, you know, the near-real-time reporting available in Q and NCR.
[16:25] >> Okay.
[16:29] >> Any questions? Any other questions?
[16:35] >> Um, so, one question I had was on the, um, uh, the MVP list of, uh...
[16:39] >> Do you want to share your screen again?
[16:44] >> Uh, no, I mean, I didn't have much, but yeah, let me share my screen.
[16:56] Yes.
[16:59] So, right now, we have started, uh, looking at the guest sales view, the fact sale items table,
[17:06] um, the MVP list of reports that will need to be ready for pilot.
[17:13] That's still an open question.
[17:17] >> So, I would look at it this way.
[17:20] It's not a list of reports.
[17:24] It is: the sales data model we have needs to be updated, and then, like, the reporting, you know, that we would do would reflect that, right?
[17:34] So, like, I would think about it less as reporting and more as, like, what data objects need to be updated with this sales data.
[17:40] I mean, if you want to backtrack from reports, that's fine. But, like, as an example, fact sales needs to be updated even if it's not critical for, like, a reporting object, right? So, I think it's both: the reports come after the data-layer update, so the focus should be on the data layer. But if you want to backtrack from reporting to make sure we are capturing all the objects in the data layer, fine. But the expectation is everything, as of right now, until somebody tells us otherwise.
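To picture that data-layer-first framing: the report-facing objects would then just be refreshed from the combined model. A toy example follows, with all object names assumed; updates to already-loaded checks would be handled by the same merge pattern sketched earlier.

```python
# Hypothetical refresh of an existing batch object from the combined model,
# inserting only checks that FACT_SALES has not seen yet.
REFRESH_FACT_SALES = """
INSERT INTO ANALYTICS.FACT_SALES
SELECT s.check_id, s.shack_id, s.business_date, s.total_amount, s.source_system
FROM   CURATED.STD_CHECKS AS s           -- combined Q + NCR model
WHERE  NOT EXISTS (
    SELECT 1 FROM ANALYTICS.FACT_SALES AS f
    WHERE  f.check_id = s.check_id
)
"""
```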
[18:15] >> Okay.
[18:24] Um, okay. Uh, yeah, that's, um, the Aloha to Workday integration.
[18:32] That's, uh, that's still an open question, right?
[18:38] Are we... >> Yep. I don't know who's doing it, or whether we're doing it or somebody else is doing it, and how that all works.
[18:43] So that is another question for the program team to confirm and finalize.
[18:51] >> Okay.
[18:55] Uh, yeah, that's all. I hope I have covered, I mean, whatever items are in scope.
[19:00] I know I have to add more details to it, uh, based on our discussion.
[19:04] But, Justin, is there any other line item that I'm missing in terms of the high-level deliverables?
[19:16] >> I suppose, at a high level, no, 'cause you have the NCR real time.
[19:22] Where's the Q one? I guess I didn't see it.
[19:24] >> Yeah, this is the Q.
[19:26] >> Oh, the third one. Okay, down there.
[19:30] >> Um, and then we have... yeah, the design approach.
[19:34] That's... so, this is more of the raw layer, and this is more of that, uh, standardized...
[19:39] >> Yeah. So, I think the design approach should be creating a consolidated curated layer, updating all existing, you know, data objects that have sales or, you know, other POS-sourced data, and ensuring that everything's working, you know, based on that. But, like, the high-level deliverable for that is bringing the data together and making it available in our, you know, existing analytics data sets and reporting data sets.
[20:14] >> Yes. And then the, uh, I mean, next is the Tableau dashboards.
[20:16] So, for the Tableau dashboards also, there would be some effort, um, that would be needed, say, for example, the order mode,
[20:30] and we'll have to, yeah, analyze how the reporting is impacted.
[20:35] >> So, nothing should be changing in the dashboards. We have to make sure we can fit the data into the NCR approach now, right? So, that's why, as we talk about order mode, we need to make sure that we have 100% certainty about how we're going to map Q order modes to the relevant NCR version. Or, if we say that we're not going to, you know, have the same, then we need to understand, like, what are we going to do differently?
[21:03] Um, because everybody's going to expect to be able to filter by existing order modes and get the response, because NCR data is not changing. We're not changing the order mode of NCR to match Q. So, for at least the interim, I think, we're going to have to make sure Q matches, you know, NCR in all of our existing reporting. Hopefully that won't be a problem. And if it is, we're going to then have to talk through, like, what our options are to resolve, you know, the issues.
[21:30] Um, so I think that, much like other stuff, like, you know, items also, the expectation is not, like, we have two item codes. It's that what we're showing today works the same way, and however we have to, you know, finagle it on the back end to make that happen, that should be transparent to the end user and should not show up in any of the existing, you know, curated data sets today. That all has to be handled before the data makes it into those.
[21:59] the data makes it into those. >> Yeah. So item item ids uh should not be
[22:03] >> Yeah. So item item ids uh should not be a problem since um they are being
[22:06] a problem since um they are being maintained the same item ID from NC.
[22:09] >> Yeah, I mean, I think, again, the devil's in the details, and we need to see the clear mapping and approach. And the sooner we can start on development and, you know, building some of this to see how it actually comes together, the better off we're going to be. So, um, I need to drop to another meeting. Um, but I think at a high level we've captured, you know, the items. I think we just need to start putting together more clear plans around, you know, the actual steps involved, at least in the ones we know about. And to, you know, what we discussed earlier,
[22:39] questions around timing and stuff like that, that needs to be something the program team provides, because there are dependencies that we're not managing.
[22:47] managing. >> Okay. All right. Sounds good. Okay.
[22:49] >> Okay. All right. Sounds good. Okay. >> I'll talk to you all later. Thanks.
[22:50] >> I'll talk to you all later. Thanks. >> Thank you so much. Bye. Bye. Thank you.
