Human Generated Data

Title

Untitled (people looking at the side of a small boat, Mantalocking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8516

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Jury 99.6
Person 99.3
Human 99.3
Room 99.3
Indoors 99.3
Person 99.2
Person 99.1
Person 99
Person 96.4
Person 95.3
Court 91.7
Person 87.3
Person 87
Person 86.9
Animal 79.1
Bird 79.1
Person 73.5
Sphere 55.1
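Each entry above pairs a label with a confidence score (0-100). Before display, such machine tags are typically filtered by a confidence threshold; a minimal sketch in Python, not the museum's actual pipeline, using a few of the Rekognition values listed above:

```python
# Minimal sketch: filter machine-generated tags by confidence threshold.
# The sample strings are copied from the Amazon Rekognition list above.

def parse_tags(lines):
    """Split 'Label 99.3' lines into (label, confidence) pairs."""
    tags = []
    for line in lines:
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def confident_tags(tags, threshold=90.0):
    """Keep only labels at or above the given confidence."""
    return [label for label, score in tags if score >= threshold]

raw = ["Jury 99.6", "Person 99.3", "Bird 79.1", "Sphere 55.1"]
print(confident_tags(parse_tags(raw)))  # ['Jury', 'Person']
```

With the default 90-point threshold, low-confidence guesses such as "Bird" and "Sphere" are dropped, which matches how the Feature analysis section below reports only the strongest detections.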

Imagga
created on 2022-01-09

man 33
people 30.7
grand piano 29.8
person 26.6
piano 24.5
male 23.4
business 23.1
businessman 22.9
percussion instrument 22.5
blackboard 22.5
teacher 21.5
adult 20.1
stringed instrument 19.5
keyboard instrument 18.7
group 18.5
classroom 18.3
education 17.3
musical instrument 17
room 16.6
office 15.7
happy 15.7
indoor 15.5
student 15.4
class 15.4
job 15
board 14.5
team 14.3
television 14.3
women 14.2
work 14.1
executive 13.8
corporate 13.7
school 13.6
desk 13.5
sitting 12.9
table 12.7
crowd 12.5
meeting 12.2
smile 12.1
modern 11.9
smiling 11.6
hand 11.5
indoors 11.4
couple 11.3
boy 11.3
men 11.2
laptop 11.1
businesswoman 10.9
lifestyle 10.8
audience 10.7
chair 10.4
career 10.4
study 10.3
black 10.2
lights 10.2
communication 10.1
child 10
silhouette 9.9
cheering 9.8
nighttime 9.8
corporation 9.6
looking 9.6
design 9.6
businesspeople 9.5
love 9.5
manager 9.3
portrait 9.1
computer 8.9
color 8.9
teaching 8.8
stadium 8.8
symbol 8.7
professional 8.7
leader 8.7
patriotic 8.6
happiness 8.6
exam 8.6
nation 8.5
adults 8.5
two 8.5
flag 8.4
holding 8.3
interior 8
vibrant 7.9
bright 7.9
hands 7.8
boss 7.6
tie 7.6
human 7.5
teamwork 7.4
cheerful 7.3
suit 7.3
lady 7.3
confident 7.3
university 7.2
icon 7.1
day 7.1

Google
created on 2022-01-09

Font 82.3
Adaptation 79.3
Monochrome 72.7
Monochrome photography 72.6
Event 70.1
Suit 70
Photo caption 67.6
Room 63.5
History 62.5
Stock photography 62.3
Rectangle 59.2
Circle 53.9
Crew 52.9
Team 52.7

Microsoft
created on 2022-01-09

text 99.3
indoor 92.3
black and white 87.8
human face 67.8
person 65.8
monochrome 54.6
man 54.5

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 60%
Happy 64.4%
Calm 26%
Confused 3.1%
Surprised 2.7%
Sad 2.6%
Disgusted 0.5%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 52-60
Gender Male, 69.4%
Calm 95.9%
Sad 3%
Happy 0.3%
Confused 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 28-38
Gender Female, 58.3%
Calm 50.8%
Fear 15%
Sad 13.3%
Angry 7.3%
Happy 5.3%
Confused 3.4%
Surprised 3%
Disgusted 1.9%

AWS Rekognition

Age 31-41
Gender Male, 74.8%
Calm 78.6%
Happy 7.4%
Surprised 3.4%
Confused 3.2%
Angry 2.9%
Fear 1.6%
Disgusted 1.5%
Sad 1.3%

AWS Rekognition

Age 24-34
Gender Male, 95.9%
Calm 99.9%
Sad 0%
Happy 0%
Angry 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 99.1%
Calm 97.3%
Happy 1.7%
Sad 0.5%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 92.8%
Calm 87.1%
Sad 5.7%
Confused 2.7%
Angry 1.3%
Fear 1.3%
Surprised 0.8%
Happy 0.7%
Disgusted 0.4%

AWS Rekognition

Age 9-17
Gender Female, 53.1%
Calm 77.9%
Sad 16.2%
Happy 1.5%
Fear 1.3%
Angry 1.1%
Confused 0.8%
Surprised 0.7%
Disgusted 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
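Each Rekognition face block above distributes confidence across eight emotions; the face's reported mood is simply the highest-scoring one. A minimal sketch, again not the museum's own code, using the percentages from the first AWS Rekognition face block:

```python
# Minimal sketch: pick the dominant emotion from a Rekognition-style
# face result. Percentages copied from the first face block above.

emotions = {
    "Happy": 64.4, "Calm": 26.0, "Confused": 3.1, "Surprised": 2.7,
    "Sad": 2.6, "Disgusted": 0.5, "Angry": 0.5, "Fear": 0.3,
}

# max over keys, ranked by their confidence values
dominant = max(emotions, key=emotions.get)
print(dominant)  # Happy
```

Note that the Google Vision results use likelihood categories ("Very unlikely" through "Very likely") rather than percentages, so the two services' face outputs are not directly comparable.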

Feature analysis

Amazon

Person 99.3%
Bird 79.1%

Captions

Microsoft

a man sitting in front of a television 55.4%
an old photo of a man 55.3%
a man sitting in front of a window 50.1%

Text analysis

Amazon

17349.
RV

Google

173
17349.
173 49. 17349.
49.