Human Generated Data

Title

Untitled (customers in neighborhood market)

Date

February 18, 1950

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18099

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Person 99.4
Shop 96.3
Shoe 96
Clothing 96
Footwear 96
Apparel 96
Person 95.3
Person 85.8
Shoe 84.4
Indoors 79.6
Interior Design 79.6
Shelf 76.8
Town 74.3
Metropolis 74.3
Urban 74.3
City 74.3
Building 74.3
Newsstand 62.1
Grocery Store 62.1
Market 57.4
Shoe 50.2

Imagga
created on 2022-03-04

supermarket 84.4
grocery store 63.7
marketplace 48.7
mercantile establishment 40.2
building 34.6
interior 34.5
cafeteria 29.7
restaurant 29.6
center 28.5
modern 25.2
inside 24.8
chair 22.3
place of business 21.6
business 21.3
architecture 19.5
shop 19
room 17.2
empty 17.2
office 16.5
furniture 16.4
indoors 15.8
urban 15.7
table 15.6
floor 14.9
light 14
counter 13.8
travel 13.4
row 13
place 13
industry 12.8
hall 12.7
work 12.6
transportation 12.6
design 12.4
station 11.7
public 11.7
city 11.6
gate 11.1
indoor 11
wood 10.8
store 10.4
industrial 10
corridor 9.8
train 9.6
hotel 9.5
wall 9.4
house 9.2
window 9.2
airport 9.1
plant 9
reflection 8.9
people 8.9
technology 8.9
chairs 8.8
factory 8.7
luxury 8.6
glass 8.6
nobody 8.5
3d 8.5
library 8.4
lights 8.3
equipment 8.1
warehouse 8.1
steel 8
shelf 7.9
space 7.8
lamp 7.6
perspective 7.5
buy 7.5
commercial 7.5
service 7.4
vacation 7.4
classroom 7.4
transport 7.3
home 7.2
decor 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.3
building 99.1
clothing 95.9
person 94.7
man 86.1
scene 82.5
marketplace 69.5
black and white 65.5
several 14.3
shop 9.5

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 97.6%
Calm 72.3%
Happy 14.7%
Fear 3.8%
Angry 2.6%
Disgusted 2.4%
Sad 2.1%
Surprised 1.7%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 96%

Captions

Microsoft

a group of people in a store 76.3%
a group of people standing in front of a store 76%
a group of people in front of a store 75.9%

Text analysis

Amazon

MOXTE
129
BOS
acioth
miradoth
miradati

Google

raceth
raceth