Human Generated Data

Title

Untitled (camera crew surrounded by crowd)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5054

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 98.3
Person 97.4
Person 97.4
Person 96.6
Person 96.3
Person 95.5
Person 95.2
Person 95
Person 92.3
Person 91.1
Person 88.3
Person 83.1
Person 82.9
Person 81.8
Interior Design 80.9
Indoors 80.9
Person 77.1
Person 75.2
Crowd 74.2
People 72
Person 71.6
Shorts 62.7
Clothing 62.7
Apparel 62.7
Handrail 62.1
Banister 62.1
Person 58.5
Theme Park 56.9
Amusement Park 56.9
Staircase 56.5
Person 51.1
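
The Amazon tags above have the shape of an AWS Rekognition DetectLabels response: a label name plus a confidence score out of 100. A minimal sketch of such a call via boto3, assuming configured AWS credentials and a local copy of the image (the file name is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local file name; the museum image is not bundled here.
    with open("steinmetz_4.2002.5054.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,        # cap the number of returned labels
            MinConfidence=50.0,  # drop low-confidence guesses
        )

    # Print "Label confidence" pairs in the format of the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")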

Clarifai
created on 2023-10-26

people 99.9
many 99.7
group 99.4
child 99.3
adult 98.6
elementary school 96.8
boy 96.6
man 96.2
woman 95.1
group together 94.7
audience 94.2
monochrome 93.8
education 93.6
crowd 93.2
classroom 92.7
furniture 91.8
sit 91.3
music 90.8
war 90.1
school 89.6

Imagga
created on 2022-01-22

sax 100
wind instrument 29.2
industry 26.5
sky 25.5
industrial 25.4
construction 24.8
building 22.3
power 20.1
stage 19.7
factory 19.3
architecture 18
tower 17.9
platform 17.5
crane 17.1
work 15.7
city 15
equipment 14.4
cloud 13.8
gas 13.5
energy 13.4
steel 13.3
structure 13.1
smoke 13
fuel 12.5
pollution 12.5
urban 12.2
new 12.1
business 11.5
environment 11.5
technology 11.1
engineering 10.5
plant 10.4
brass 10
silhouette 9.9
pipe 9.7
black 9.6
development 9.5
skyline 9.5
modern 9.1
old 9.1
night 8.9
light 8.7
high 8.7
concrete 8.6
grunge 8.5
bridge 8.5
oil 8.4
house 8.4
vintage 8.3
dirty 8.1
history 8
water 8
machine 7.9
design 7.9
drawing 7.9
chimney 7.8
pump 7.8
scene 7.8
production 7.8
men 7.7
chemical 7.7
downtown 7.7
site 7.5
metal 7.2
river 7.1
working 7.1
day 7.1
travel 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.9
person 94.3
clothing 82.9
black 80.6
player 73.8
man 72.5
old 69.6
white 62.8
crowd 32.5

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 93.7%
Calm 58.2%
Confused 14.6%
Happy 12.3%
Sad 9.9%
Surprised 1.6%
Angry 1.5%
Disgusted 1%
Fear 0.9%

AWS Rekognition

Age 20-28
Gender Female, 95.2%
Calm 97.8%
Happy 0.9%
Fear 0.6%
Sad 0.3%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Male, 80.9%
Calm 92.1%
Sad 2.6%
Fear 2.1%
Angry 1.2%
Happy 0.8%
Disgusted 0.6%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 16-22
Gender Female, 74.9%
Calm 50.3%
Happy 28.3%
Sad 8.6%
Confused 5.3%
Disgusted 3.5%
Surprised 1.7%
Angry 1.4%
Fear 0.9%

AWS Rekognition

Age 21-29
Gender Male, 99.4%
Calm 50.2%
Happy 26.1%
Confused 14.2%
Disgusted 6.5%
Sad 1%
Surprised 0.9%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Male, 71%
Happy 64.4%
Sad 20.7%
Calm 8.8%
Angry 1.7%
Disgusted 1.3%
Confused 1.2%
Fear 1.2%
Surprised 0.7%

AWS Rekognition

Age 28-38
Gender Male, 91.3%
Calm 90.9%
Angry 4.9%
Sad 1.7%
Happy 1.1%
Surprised 0.6%
Disgusted 0.4%
Confused 0.3%
Fear 0.1%
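
The seven readings above follow the structure of a Rekognition DetectFaces response when all facial attributes are requested: an age range, a gender guess with confidence, and a ranked list of emotions. A minimal sketch, under the same hypothetical-file assumption as before:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.5054.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; order by confidence as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")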

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
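
Google Vision reports likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client, again assuming a local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.5054.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    def bucket(likelihood) -> str:
        # Render e.g. VERY_UNLIKELY as "Very unlikely", matching the listing.
        return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

    for face in response.face_annotations:
        print("Surprise", bucket(face.surprise_likelihood))
        print("Anger", bucket(face.anger_likelihood))
        print("Sorrow", bucket(face.sorrow_likelihood))
        print("Joy", bucket(face.joy_likelihood))
        print("Headwear", bucket(face.headwear_likelihood))
        print("Blurred", bucket(face.blurred_likelihood))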

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

31077
FLOYDS
Y139A8

Google

31077
31077
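
The strings above are OCR hits; on the Amazon side they match the shape of a Rekognition DetectText response, which returns both whole lines and individual words with confidences. A minimal sketch, under the same file-name assumption:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.5054.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Report whole lines only; Rekognition also returns individual WORD items.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])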