Human Generated Data

Title

Untitled (circus performer embracing man in street)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5053

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.4
Human 99.4
Person 96.3
Person 93.6
Person 90.6
Crowd 89.4
Person 86.2
Person 83.2
People 81.6
Person 77.9
Person 75.6
Person 66.4
Audience 58.6
Person 42.9
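
The label tags above are the kind of output returned by Amazon Rekognition's label-detection API. A minimal sketch of retrieving such tags with boto3 follows; the image file name and the confidence threshold are illustrative assumptions, not part of the museum record.

# Minimal sketch: label tags via Amazon Rekognition (boto3).
# The file name and MinConfidence value are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5053.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=40,
)

# Print each label and its confidence, mirroring the "Person 99.4" lines above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")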

Clarifai
created on 2023-10-26

people 99.2
many 98.5
crowd 96
man 95.8
adult 95
monochrome 94.2
group 91.3
street 90.4
woman 89.9
group together 81.7
war 79.9
no person 76.5
dining 75.6
child 75.4
furniture 74.6
outdoors 74.5
old 74
table 72.6
restaurant 71.5
sit 70
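
The concepts above resemble output from Clarifai's general image-recognition model. A rough sketch of one way to call its v2 REST predict endpoint follows; the endpoint path, the model ID, the token handling, and the response shape are assumptions and may differ between Clarifai API versions.

# Rough sketch: predicting concepts with Clarifai's v2 REST API via `requests`.
# The endpoint path, the model ID ("general-image-recognition"), and the token
# handling are assumptions; newer Clarifai API versions may require extra fields.
import requests

CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"                   # placeholder
image_url = "https://example.org/steinmetz_4.2002.5053.jpg"   # placeholder

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept has a name and a 0-1 score; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")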

Imagga
created on 2022-01-22

sketch 79.8
drawing 67.2
representation 45.4
business 23.1
grunge 21.3
architecture 21.1
design 18.6
building 18.2
construction 17.1
plan 17
city 16.6
art 15.2
graphic 14.6
wagon 14.3
modern 14
hall 13.3
map 12.7
structure 12.5
retro 12.3
urban 12.2
office 12.2
technology 11.9
finance 11.8
silhouette 11.6
project 11.5
wheeled vehicle 11.1
pattern 10.9
house 10.9
space 10.9
3d 10.8
vintage 10.8
architect 10.6
style 10.4
center 10
tracing 9.9
idea 9.8
sign 9.8
businessman 9.7
success 9.7
paper 9.5
cityscape 9.5
work 9.4
old 9.1
tower 9
sky 8.9
digital 8.9
development 8.8
symbol 8.8
scene 8.7
diagram 8.6
industry 8.5
travel 8.4
horizontal 8.4
dollar 8.4
shape 8.2
dirty 8.1
lines 8.1
growth 7.9
color 7.8
graph 7.7
chart 7.6
texture 7.6
frame 7.6
engineering 7.6
pencil 7.6
skyline 7.6
town 7.4
equipment 7.4
investment 7.3
antique 7.3
detail 7.2
computer 7.2
black 7.2
copy 7.1
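
The Imagga tags above can be reproduced with Imagga's tagging endpoint. The sketch below assumes the public v2 REST API; the credentials, the image URL, and the exact response shape are placeholders and assumptions.

# Sketch: requesting tags from Imagga's v2 tagging endpoint with `requests`.
# Credentials and image URL are placeholders; the response shape is an assumption
# based on Imagga's public API documentation.
import requests

IMAGGA_KEY = "YOUR_API_KEY"                                   # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"                             # placeholder
image_url = "https://example.org/steinmetz_4.2002.5053.jpg"   # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Print tag names with confidences, e.g. "sketch 79.8".
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")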

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.2
drawing 81
white 71.2
people 70.4
old 68.7
black 67
person 59.9
group 59
clothing 55.4
posing 36
crowd 0.7
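
The Microsoft tags above correspond to Azure Computer Vision image analysis. The sketch below assumes the v3.2 REST "analyze" route; the endpoint, key, and image URL are placeholders, and the response shape is an assumption based on Microsoft's public documentation.

# Sketch: tagging an image with the Azure Computer Vision "analyze" REST call.
# Endpoint, key, and image URL are placeholders; the v3.2 route and response
# shape are assumptions.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
image_url = "https://example.org/steinmetz_4.2002.5053.jpg"           # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": image_url},
    timeout=30,
)
resp.raise_for_status()

# Azure reports confidence on a 0-1 scale; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")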

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 87.9%
Disgusted 92.7%
Calm 3.4%
Sad 1.8%
Angry 0.9%
Happy 0.6%
Confused 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 92.1%
Calm 39.3%
Sad 18.8%
Confused 12.3%
Disgusted 12.1%
Happy 10.3%
Angry 3.6%
Surprised 2.5%
Fear 1.1%

AWS Rekognition

Age 22-30
Gender Male, 63.1%
Fear 27.2%
Sad 22.9%
Disgusted 14.2%
Calm 12.7%
Angry 8.5%
Happy 8%
Surprised 3.3%
Confused 3.3%

AWS Rekognition

Age 27-37
Gender Male, 87.7%
Calm 43.9%
Sad 26.7%
Happy 24.5%
Angry 2.9%
Disgusted 0.8%
Confused 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 24-34
Gender Male, 99.1%
Calm 91.1%
Surprised 3.5%
Happy 2.4%
Sad 0.9%
Fear 0.8%
Angry 0.8%
Disgusted 0.5%
Confused 0.1%

AWS Rekognition

Age 25-35
Gender Female, 57.7%
Calm 95.4%
Fear 1.1%
Disgusted 1.1%
Sad 1%
Happy 1%
Angry 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 24-34
Gender Male, 92%
Calm 61.5%
Happy 30.9%
Fear 2.7%
Sad 2.7%
Confused 0.7%
Disgusted 0.7%
Angry 0.5%
Surprised 0.3%

AWS Rekognition

Age 23-31
Gender Male, 60%
Calm 62%
Happy 17%
Sad 13.2%
Angry 2.8%
Fear 2.1%
Disgusted 2.1%
Surprised 0.5%
Confused 0.3%

AWS Rekognition

Age 21-29
Gender Female, 80.8%
Sad 46.9%
Calm 43.4%
Confused 3%
Disgusted 2.4%
Angry 1.8%
Fear 1.3%
Surprised 0.8%
Happy 0.5%

AWS Rekognition

Age 23-33
Gender Male, 56.6%
Confused 39.5%
Calm 28.3%
Sad 13%
Surprised 6.5%
Happy 4.2%
Fear 3.8%
Angry 3.4%
Disgusted 1.4%

AWS Rekognition

Age 26-36
Gender Male, 96.6%
Calm 37%
Happy 35.5%
Sad 15.9%
Angry 5.5%
Fear 2%
Disgusted 1.8%
Surprised 1.6%
Confused 0.7%

AWS Rekognition

Age 31-41
Gender Male, 57.7%
Happy 90.2%
Calm 5.1%
Sad 2.8%
Angry 0.6%
Fear 0.4%
Surprised 0.3%
Disgusted 0.3%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Male, 94.5%
Calm 96.4%
Sad 2%
Happy 0.6%
Angry 0.5%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 24-34
Gender Male, 96.2%
Calm 70.9%
Happy 24.2%
Sad 2.5%
Confused 0.7%
Angry 0.6%
Fear 0.4%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 25-35
Gender Female, 80.7%
Calm 96.2%
Disgusted 2.5%
Sad 0.4%
Angry 0.4%
Happy 0.2%
Confused 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 74.5%
Calm 97.3%
Sad 0.8%
Happy 0.5%
Disgusted 0.4%
Fear 0.4%
Angry 0.3%
Confused 0.2%
Surprised 0.2%
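
The per-face age, gender, and emotion estimates above are the kind of result returned by Amazon Rekognition's DetectFaces API. A minimal sketch with boto3 follows; the image file name is an illustrative assumption.

# Minimal sketch: face attributes via Amazon Rekognition DetectFaces (boto3).
# The file name is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5053.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")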

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
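
The Google Vision attributes above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) map to the likelihood fields of the Cloud Vision face-detection response. The sketch below assumes the google-cloud-vision 2.x Python client and a placeholder file name.

# Sketch: face attribute likelihoods via Google Cloud Vision face detection.
# Assumes google-cloud-vision 2.x; the file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.5053.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums such as VERY_UNLIKELY, matching the values above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)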

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 89.8%
text visuals 8.7%
beaches seaside 1.2%

Text analysis

Amazon

31084
Station
AIRLINE
SEABOARD AIRLINE RAILROAD
RAILROAD
SEABOARD
ST
MAIN ST
MAIN
Passenger Station
5
Passenger
YO
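
The word-level detections above are the kind of output produced by Amazon Rekognition's DetectText API. A minimal sketch with boto3 follows; the image file name is an illustrative assumption.

# Minimal sketch: OCR words via Amazon Rekognition DetectText (boto3).
# The file name is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5053.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# WORD detections give the individual tokens listed above ("SEABOARD", "MAIN", ...).
for det in response["TextDetections"]:
    if det["Type"] == "WORD":
        print(det["DetectedText"], f"{det['Confidence']:.1f}")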

Google

SEABOARD AIRLINE RAILROAD Passenger Station 31084 MAGO
SEABOARD
AIRLINE
RAILROAD
Passenger
Station
31084
MAGO
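
The Google text results above follow the usual Cloud Vision text-detection pattern: the first annotation is the full detected block and the remaining annotations are individual words. The sketch below assumes the google-cloud-vision Python client and a placeholder file name.

# Sketch: OCR via Google Cloud Vision text detection.
# Assumes the google-cloud-vision client; the file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.5053.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
annotations = response.text_annotations

if annotations:
    # The first annotation is the full detected block
    # (e.g. "SEABOARD AIRLINE RAILROAD Passenger Station 31084 MAGO" above).
    print(annotations[0].description)
    # The remaining annotations are the individual words.
    for word in annotations[1:]:
        print(word.description)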