Human Generated Data

Title

Untitled (group of people lined up outside "Creamette")

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5388

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.6
Person 99.4
Person 99.3
Person 99.3
Person 99.1
Meal 98.9
Food 98.9
Person 98.9
Person 98.7
Person 98.6
Person 98.3
Person 98.1
Restaurant 93.4
Shoe 91.1
Footwear 91.1
Clothing 91.1
Apparel 91.1
Shoe 90.9
Kiosk 90.7
Bus Stop 84.2
Person 82.5
Person 82.2
Train 76.5
Transportation 76.5
Vehicle 76.5
Diner 64.9
Person 61.6
People 60.1
Bus 60
Cafeteria 60
Person 57.9
Pedestrian 56.4
Person 47.1
Person 43.5
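
The label/score pairs above follow the output shape of Amazon Rekognition's label-detection call: each tag carries a confidence value from 0 to 100. A minimal sketch of how such tags could be produced with boto3 is given below; the S3 bucket and object names are placeholders, not the museum's actual storage.

# Sketch only: Rekognition label detection via boto3 (placeholder S3 location).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_5388.jpg"}},
    MaxLabels=50,
    MinConfidence=40,
)

# Each label has a name and a 0-100 confidence, matching pairs like "Person 99.8".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")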

Imagga
created on 2022-01-23

carousel 30.7
passenger 28.7
ride 26.2
people 20.6
mechanical device 17.9
travel 17.6
transport 16.4
wagon 15.5
sky 15.3
transportation 15.2
car 15
adult 14.2
wheeled vehicle 14
vehicle 13.8
umbrella 13.7
vacation 13.1
mechanism 11.8
parasol 11.2
women 11.1
lifestyle 10.8
leisure 10.8
man 10.7
outdoor 10.7
auto 10.5
old 10.4
water 10
person 9.9
fun 9.7
day 9.4
youth 9.4
speed 9.2
city 9.1
business 9.1
summer 9
urban 8.7
work 8.6
automobile 8.6
world 8.6
men 8.6
outside 8.5
beach 8.5
black 8.4
attractive 8.4
ocean 8.3
outdoors 8.2
chair 8.1
clothing 8.1
road 8.1
sand 7.8
couple 7.8
architecture 7.8
sitting 7.7
move 7.7
happy 7.5
container 7.4
holiday 7.2
male 7.1
working 7.1
happiness 7
sea 7
together 7
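
The Imagga tags above (e.g. "carousel 30.7") pair an English tag with a confidence score. A rough sketch of a request against Imagga's public tagging endpoint follows; the API key, secret, and image URL are placeholders, and the response layout shown in the comment is an assumption to be checked against Imagga's current documentation.

# Sketch only: Imagga v2 tagging endpoint via the requests library.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_5388.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Assumed layout: result.tags -> [{"confidence": 30.7, "tag": {"en": "carousel"}}, ...]
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))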

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
person 93.1
clothing 84.5
footwear 59.3
old 42.6

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 78.8%
Sad 43%
Calm 25.4%
Happy 11.3%
Confused 8.1%
Disgusted 6.4%
Angry 3%
Fear 2.2%
Surprised 0.7%

AWS Rekognition

Age 28-38
Gender Male, 99.3%
Calm 50.5%
Sad 47.9%
Confused 0.7%
Happy 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Surprised 0.1%
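
The two blocks above (age range, gender with confidence, and emotion percentages) match the structure returned by Amazon Rekognition face detection. A minimal sketch, reusing the placeholder S3 location from the label-detection example:

# Sketch only: Rekognition face attributes via boto3 (placeholder S3 location).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_5388.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence}; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")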

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
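
Google Vision reports face attributes as categorical likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above read "Very unlikely" / "Unlikely". A minimal sketch with the google-cloud-vision client library; the local file path is a placeholder, and in older library versions the Likelihood enum lives under vision.enums instead.

# Sketch only: Google Cloud Vision face detection (placeholder file path).
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_5388.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)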

Feature analysis

Amazon

Person 99.8%
Shoe 91.1%
Train 76.5%
Bus 60%

Captions

Microsoft

a group of people standing in front of a building 82.3%
a group of people standing in front of a bus 36.2%
a group of people around each other 36.1%
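
The captions above, with confidence percentages, are the kind of output produced by Microsoft's Computer Vision "describe" operation. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file path are placeholders, and method names may vary between SDK versions.

# Sketch only: Azure Computer Vision image description (placeholder credentials).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://example-resource.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"                                                    # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("steinmetz_5388.jpg", "rb") as f:  # placeholder path
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidence is returned on a 0-1 scale; multiply by 100 to match the listing above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")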

Text analysis

Amazon

26922
Beamer
Democracy
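
The strings above ("26922", "Beamer", "Democracy") are line-level results of the kind returned by Amazon Rekognition text detection. A minimal sketch, again with a placeholder S3 location:

# Sketch only: Rekognition text detection via boto3 (placeholder S3 location).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_5388.jpg"}}
)

# LINE detections are whole strings; WORD detections repeat them token by token.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")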

Google

26922 26922.
26922
26922.
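
The Google results repeat "26922" because Vision OCR returns one annotation for the full detected text followed by one per word or block. A minimal sketch with the same placeholder file path as the face-detection example:

# Sketch only: Google Cloud Vision OCR (placeholder file path).
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_5388.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] holds the full detected text; the rest are word-level boxes.
if response.text_annotations:
    print(response.text_annotations[0].description)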