Human Generated Data

Title

Untitled (group of people lined up outside "Creamette")

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5387

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Meal 100
Food 100
Restaurant 99.8
Person 99.4
Human 99.4
Person 99.3
Person 98.3
Person 97.1
Person 96.7
Person 95.7
Person 94.9
Diner 92.2
Person 92
Person 81.4
Person 71.3
Person 66.8
Person 66.4
Person 55.6
Person 48.7
Person 43.9

Clarifai
created on 2023-10-27

people 99.7
man 97.2
street 96.9
monochrome 96.8
group 96.6
group together 96.2
adult 94.5
woman 92.1
stock 90.2
many 89.1
transportation system 88.7
restaurant 87.6
shopping 85.8
commerce 84.4
sunblind 84.2
booth 83.7
city 83
market 82.2
business 81.6
recreation 80.5

Imagga
created on 2022-01-23

carousel 100
ride 96.2
mechanical device 72.1
mechanism 50.7
sky 25.5
travel 24.6
device 24.3
city 24.1
architecture 18.9
building 18.6
industry 16.2
urban 13.1
vacation 13.1
landscape 12.6
sea 12.5
stall 12.5
ocean 12.4
destination 12.2
water 12
street 12
gas 11.6
tourism 11.5
cloud 11.2
construction 11.1
beach 11
industrial 10.9
road 10.8
pump 10.8
transportation 10.8
fuel 10.6
business 10.3
station 10.3
winter 10.2
power 10.1
transport 10
snow 10
holiday 10
house 10
structure 10
tree 10
tourist 10
river 9.8
sun 9.7
wagon 9.6
tropical 9.4
clouds 9.3
oil 9.3
energy 9.2
outdoors 9
trees 8.9
night 8.9
scenic 8.8
bridge 8.5
summer 8.4
facility 8.3
exterior 8.3
island 8.2
landmark 8.1
sand 7.9
work 7.8
wheeled vehicle 7.8
culture 7.7
old 7.7
outdoor 7.6
relax 7.6
trip 7.5
sign 7.5
palm 7.5
leisure 7.5
recreation 7.2
resort 7.2
park 7.2
platform 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
outdoor 91.5
fast food 75
person 70.3
people 67.3
billboard 51.1

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 73.3%
Sad 80.8%
Confused 10%
Happy 2.9%
Calm 2.6%
Disgusted 1.3%
Angry 1.1%
Fear 0.7%
Surprised 0.5%

AWS Rekognition

Age 37-45
Gender Female, 63.4%
Sad 76.3%
Calm 21.3%
Confused 0.8%
Happy 0.7%
Angry 0.3%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 19-27
Gender Female, 74.3%
Sad 97.5%
Disgusted 0.8%
Calm 0.7%
Happy 0.4%
Angry 0.2%
Fear 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Female, 76.3%
Calm 92.1%
Happy 3.2%
Sad 2.5%
Disgusted 0.6%
Confused 0.6%
Angry 0.5%
Fear 0.4%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

ICE
26931.
Creametic
ICE CORHINO
Woup
Woup Finga
Finga
CORHINO

Google

ICE
CREAD
ICE CREAD ENA 26931.
ENA
26931.