Human Generated Data

Title

Untitled (formally dressed women sitting in convertible car)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7398

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Person 98.9
Person 98.1
Person 97.4
Person 96.2
Person 94.2
Art 79.6
Text 74.3
Crowd 64.5
Vehicle 61.7
Transportation 61.7
People 60.3
Leisure Activities 59.4
Painting 55.5

Clarifai
created on 2023-10-25

people 99.8
group 97.9
man 97
adult 96.5
woman 95.5
administration 93.2
leader 92.8
veil 91
group together 90.3
many 90
wedding 88.6
ceremony 82.7
vehicle 77.7
musician 76.8
chair 76.6
several 76.2
wear 76.2
music 75.3
crowd 74.5
interaction 74.2

Imagga
created on 2022-01-08

groom 21.3
person 21.1
old 19.5
man 18.1
newspaper 17.8
grunge 16.2
vintage 15.7
black 15.6
people 15.6
product 14.2
life 13.7
world 13.4
silhouette 13.2
art 12.4
male 11.3
fan 11.1
creation 11
light 10.8
color 10.6
grungy 10.4
building 10.4
antique 10.4
paint 10
dirty 9.9
retro 9.8
texture 9.7
businessman 9.7
wall 9.4
space 9.3
power 9.2
outdoor 9.2
frame 9.2
business 9.1
history 8.9
sky 8.9
text 8.7
water 8.7
men 8.6
room 8.4
smoke 8.4
city 8.3
fountain 8.3
alone 8.2
follower 8.2
border 8.1
graphic 8
negative 7.9
weathered 7.6
nuclear weapon 7.6
canvas 7.6
pattern 7.5
sport 7.5
style 7.4
activity 7.2
women 7.1
summer 7.1
creative 7.1
architecture 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.7
drawing 84.7
person 79.2
clothing 76.3
old 71.3
sketch 65.7
woman 55.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 6-16
Gender Male, 84.7%
Calm 48.7%
Happy 40.9%
Fear 3.4%
Surprised 2.5%
Disgusted 1.6%
Sad 1.3%
Angry 0.9%
Confused 0.7%

AWS Rekognition

Age 25-35
Gender Female, 75.9%
Calm 97.8%
Happy 0.7%
Surprised 0.4%
Sad 0.4%
Fear 0.3%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 33-41
Gender Female, 61.5%
Calm 93.2%
Happy 2.2%
Confused 1.2%
Surprised 1.2%
Sad 1%
Angry 0.5%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Female, 94.2%
Sad 97.4%
Fear 1.8%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Calm 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Male, 69.6%
Confused 74.1%
Calm 17.5%
Fear 2.2%
Sad 1.8%
Angry 1.3%
Disgusted 1.3%
Surprised 1.3%
Happy 0.6%

AWS Rekognition

Age 21-29
Gender Male, 97.8%
Surprised 63.1%
Sad 21.3%
Calm 7.3%
Disgusted 3.7%
Confused 2.3%
Angry 1%
Fear 1%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 90.3%
nature landscape 7.8%

Captions

Microsoft
created on 2022-01-08

an old photo of a person 85.7%
an old photo of a group of people 85.5%
old photo of a person 84.2%

Text analysis

Amazon

15934

Google

15934· Is934.
15934·
Is934.