Human Generated Data

Title

Untitled (men on stage at starred podium)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10561

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.9
Human 98.9
Person 98.2
Person 92.7
Person 80.1
Person 79.3
Person 77.2
Text 76.2
Amusement Park 72.7
Theme Park 72.3
Crowd 72.2
Person 72
Person 60.6
Musician 60
Musical Instrument 60
Silhouette 56.9
People 56.5
Poster 56.5
Advertisement 56.5
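
The Amazon tags above pair a label name with a 0-100 confidence score, which is the shape of the Rekognition DetectLabels response. A minimal sketch of how such a tag list can be produced with boto3 follows; the file name, region, and confidence threshold are assumptions for illustration, not details from this record:

    import boto3

    # Assumed region and local file; the collection's actual pipeline is not documented here.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_podium.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns a list of labels, each with a Name and a Confidence score.
    # MinConfidence=55 is an assumed cutoff, roughly matching the lowest tags shown above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")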

Clarifai
created on 2023-10-25

people 98.6
wear 95.1
group 93.8
illustration 93.7
movie 93.2
many 92.2
monochrome 91.4
man 91
art 89.5
music 89.4
adult 89.4
Halloween 88
audience 84.2
costume 81.7
woman 81.3
group together 81.2
vehicle 76.6
moon 75.9
retro 75.9
design 73.9

Imagga
created on 2022-01-09

city 23.3
building 21.2
architecture 21.2
barbershop 20.2
shop 20
car 16.4
old 16
urban 15.7
freight car 15.7
mercantile establishment 15.2
structure 14.6
vehicle 13.6
wheeled vehicle 13
ancient 13
travel 12
street 12
grunge 11.9
buildings 11.3
billboard 11
place of business 10.1
house 10
black 9.6
antique 9.5
art 9.4
town 9.3
signboard 8.9
sky 8.9
night 8.9
window 8.6
old fashioned 8.6
vintage 8.3
tourism 8.2
technology 8.2
aged 8.1
dirty 8.1
history 8
light 8
decoration 7.7
cityscape 7.6
dark 7.5
monument 7.5
design 7.4
historic 7.3
truck 7.3
people 7.2
paint 7.2
landmark 7.2
modern 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.6
black and white 88.2
cartoon 83.8
drawing 78.9
clothing 74.9
person 68.8

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 70.7%
Surprised 81.6%
Calm 5.9%
Confused 4.1%
Fear 3.7%
Sad 2.2%
Disgusted 1.5%
Angry 0.7%
Happy 0.4%

AWS Rekognition

Age 37-45
Gender Male, 99.7%
Calm 69.5%
Disgusted 11.3%
Happy 7.1%
Sad 4.4%
Angry 4.1%
Fear 1.3%
Confused 1.3%
Surprised 0.9%

AWS Rekognition

Age 20-28
Gender Male, 62.5%
Calm 95%
Sad 1.1%
Angry 1%
Fear 0.9%
Disgusted 0.8%
Confused 0.6%
Surprised 0.3%
Happy 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Disgusted 42%
Angry 16%
Surprised 15.4%
Calm 9.3%
Confused 8.4%
Sad 5.5%
Fear 1.8%
Happy 1.6%

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Happy 33.6%
Sad 30.6%
Calm 9%
Fear 8.5%
Confused 5.4%
Angry 5.1%
Surprised 4%
Disgusted 3.9%

AWS Rekognition

Age 33-41
Gender Male, 95.4%
Calm 76.3%
Sad 9.4%
Happy 5.5%
Angry 2.5%
Disgusted 2.2%
Fear 1.7%
Surprised 1.6%
Confused 0.8%
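
Each AWS Rekognition block above (an age range, a gender with confidence, and a ranked emotion list) mirrors one FaceDetails entry from the DetectFaces API. A hedged sketch under the same boto3 assumptions as before:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("steinmetz_podium.jpg", "rb") as f:  # assumed file name
        image_bytes = f.read()

    # Attributes=["ALL"] is required to get AgeRange, Gender, and Emotions
    # in addition to the default bounding-box data.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort descending to match the ranked lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")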

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
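
The Google Vision blocks report each face attribute as a likelihood bucket (e.g. "Very unlikely") rather than a percentage, which is how the google-cloud-vision face_detection response is structured. A minimal sketch, with the file name again an assumption:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_podium.jpg", "rb") as f:  # assumed file name
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    for face in response.face_annotations:
        # Likelihoods are enum buckets (VERY_UNLIKELY .. VERY_LIKELY), not scores.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)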

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft
created on 2022-01-09

a sign above a store 43.9%
a photo of a person 38.4%

Text analysis

Amazon

Ford
UNUM
20869.
20869
YT33AC
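
The Amazon text results ("Ford", "UNUM", "20869.", and so on) match the output of Rekognition's DetectText OCR endpoint, which returns detections at both LINE and WORD granularity. A sketch under the same boto3 assumptions:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("steinmetz_podium.jpg", "rb") as f:  # assumed file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a LINE or one of its constituent WORDs;
    # the word-level hits are the granularity shown in the list above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])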

Google

20869 • M Ford 20869.
20869
20869.
M
Ford