Human Generated Data

Title

Untitled (three women and a man sitting on ivy-covered wall)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8329

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.2
Person 99.1
Person 98.6
Apparel 98.4
Clothing 98.4
Face 89.2
Outdoors 83.9
Nature 78.2
Hat 76.2
Portrait 64.6
Photography 64.6
Photo 64.6
Suit 59.5
Coat 59.5
Overcoat 59.5
Countryside 59.3
Plant 55.7
Standing 55.2
Soil 55.1
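
The tag/confidence pairs above are the shape of output returned by AWS Rekognition's label-detection API. As a minimal sketch of how such tags can be produced (not the museum's actual pipeline; the file name and the MinConfidence threshold are illustrative assumptions):

```python
# Sketch: reproduce tag/confidence pairs like those above with
# AWS Rekognition label detection via boto3. The file name and the
# confidence threshold are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local scan of the photograph.
with open("steinmetz_8329.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score in the list above is 55.1
)

# Print "Label Confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```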

Clarifai
created on 2023-10-25

people 99.8
adult 97
man 96.9
group 96.3
group together 96.1
child 92.9
woman 92.6
leader 91.9
three 91.7
two 90.4
administration 90.1
outfit 88.8
wear 88.6
soldier 88.2
military 88
war 87.2
five 86.7
four 82.2
vehicle 78.8
police 75.5

Imagga
created on 2022-01-09

man 27.6
person 27.2
outdoor 26
male 24.8
people 21.2
sky 17.9
landscape 16.4
outdoors 14.2
adult 13
world 12.6
silhouette 12.4
men 12
old 11.8
park 11.5
businessman 11.5
active 10.9
field 10.9
sun 10.5
cloud 10.3
grass 10.3
outside 10.3
life 10.3
day 10.2
alone 10
happy 10
business 9.7
summer 9.6
happiness 9.4
natural 9.4
environment 9
mountain 8.9
success 8.9
work 8.8
autumn 8.8
forest 8.7
fog 8.7
walking 8.5
free 8.4
manager 8.4
technology 8.2
stone 8.1
rural 7.9
hike 7.8
season 7.8
space 7.8
hiking 7.7
relax 7.6
sign 7.5
vintage 7.4
freedom 7.3
danger 7.3
sunset 7.2
activity 7.2
farm 7.1
love 7.1
job 7.1
working 7.1
child 7

Microsoft
created on 2022-01-09

text 96.8
clothing 92.7
black and white 90.7
outdoor 90.3
person 90
man 74

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Calm 46.5%
Happy 27.3%
Confused 10.6%
Surprised 6.7%
Disgusted 3%
Fear 2.6%
Sad 1.9%
Angry 1.3%

AWS Rekognition

Age 35-43
Gender Male, 90.6%
Calm 96.7%
Happy 2.5%
Sad 0.2%
Confused 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 41-49
Gender Male, 100%
Sad 49.7%
Calm 38.9%
Confused 4.1%
Happy 3.7%
Angry 1.7%
Surprised 1.2%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 98.9%
Happy 56.2%
Fear 24.4%
Calm 13.3%
Sad 1.9%
Surprised 1.2%
Confused 1.1%
Disgusted 1%
Angry 0.8%
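
The four face records above (age range, gender, and per-emotion scores) match the structure of AWS Rekognition's face-detection response. A hedged sketch of how such records can be generated, again assuming a hypothetical local image file:

```python
# Sketch: per-face age range, gender, and emotion scores in the format
# used above, via AWS Rekognition face detection. File name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8329.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion order is not guaranteed; sort by confidence, descending,
    # to match the presentation above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```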

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Hat 76.2%

Captions

Microsoft
created on 2022-01-09

an old photo of a person 88.9%
an old photo of a girl 79.2%
an old photo of a boy 65.4%

Text analysis

Amazon

9884
SI
9884.
AROA
2 2 3 9 9 AROA
2 2 3 9 9
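
The detected strings above (frame numbers and edge markings on the negative) are the kind of output AWS Rekognition's text-detection API returns. A minimal sketch, with the same assumed file name:

```python
# Sketch: OCR of edge markings on the negative via AWS Rekognition
# text detection. File name is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8329.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to entries like "2 2 3 9 9 AROA";
# WORD detections are the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```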

Google

988 ч. 9884 4884. JS
988
ч.
9884
4884.
JS