Human Generated Data

Title

Untitled (parents sitting on porch with three children)

Date

1944

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7244

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.7
Human 98.7
Clothing 98.5
Apparel 98.5
Person 96.8
Person 95.8
Railing 85.6
Furniture 79.8
Long Sleeve 71.3
Sleeve 71.3
Person 66.3
Handrail 65.9
Banister 65.9
Shirt 65.4
Suit 59.6
Coat 59.6
Overcoat 59.6
Porch 56.6

Clarifai
created on 2023-10-25

people 99.8
group 99
adult 97.3
woman 97.3
man 96.6
group together 96.4
actor 94.1
three 92.5
monochrome 92.4
administration 92.1
leader 90.5
actress 90.1
four 89.4
several 88.6
child 87.2
facial expression 84
two 81.7
singer 78.4
theater 77.4
wedding 76.6

Imagga
created on 2022-01-08

parallel bars 25.6
man 24.8
people 24
gymnastic apparatus 21.6
business 20
musical instrument 19.9
male 19.1
window 18
barrier 16.5
railing 16.4
sports equipment 16.3
balcony 15.8
couple 14.8
person 14.8
percussion instrument 14.5
office 14.5
interior 14.1
businessman 14.1
modern 14
adult 13.9
happy 13.1
architecture 12.5
silhouette 12.4
structure 12.4
obstruction 11.9
men 11.2
women 11.1
black 10.8
building 10.6
working 10.6
travel 10.6
urban 10.5
passenger 10.5
outdoors 10.4
portrait 10.3
day 10.2
smiling 10.1
equipment 10.1
family 9.8
gate 9.5
sitting 9.4
outside 9.4
lifestyle 9.4
two 9.3
house 9.2
city 9.1
indoor 9.1
brass 8.9
marimba 8.9
together 8.8
happiness 8.6
smile 8.5
worker 8.4
finance 8.4
old 8.4
life 8.2
park 8.2
room 8
wind instrument 8
work 7.8
airport 7.8
standing 7.8
glass 7.8
corporate 7.7
wall 7.7
pretty 7.7
walk 7.6
fashion 7.5
meeting 7.5
professional 7.5
inside 7.4
back 7.3
suit 7.2
history 7.1
job 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

statue 91
clothing 90.7
person 87.9
outdoor 86.8
text 86.7
black and white 80.7
man 75.2

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 89.8%
Calm 94.8%
Happy 1.8%
Sad 1.2%
Disgusted 0.6%
Surprised 0.6%
Confused 0.4%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Calm 88.5%
Confused 7.4%
Sad 2.7%
Angry 0.4%
Happy 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 21-29
Gender Female, 75.4%
Sad 41.5%
Calm 33.3%
Happy 18.3%
Fear 2.3%
Confused 1.5%
Disgusted 1.2%
Angry 1%
Surprised 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Text analysis

Amazon

21690.
KODAR
TOU
EVERYA KODAR
EVERYA

Google

21690.
21690.