Human Generated Data

Title

Untitled (children seated around decorated table)

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8408

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 98.5
Person 97.4
Person 97.1
Person 94.7
Person 87.2
Machine 74.4
Person 72.2
People 71.3
Room 64.7
Indoors 64.7
Workshop 64.6
Clinic 61
Musician 58.2
Musical Instrument 58.2
Leisure Activities 57.9
Person 43.8

Clarifai
created on 2023-10-25

people 99.7
group 99.5
woman 97.6
group together 96.8
child 96.3
man 96.3
adult 94.7
leader 94.5
medical practitioner 92.7
sit 92.3
administration 89.2
family 89
sitting 88.2
education 88
many 87.7
chair 86.8
war 84.3
music 82.7
vehicle 82.5
several 82.1

Imagga
created on 2022-01-09

person 47.1
people 32.3
man 31.6
adult 27.5
male 26.3
home 24.7
indoors 23.7
room 23
professional 21
teacher 20.6
women 19.8
smiling 18.8
happy 18.8
patient 18.5
work 17.3
men 16.3
cheerful 16.2
life 16.2
sitting 14.6
house 14.2
happiness 14.1
indoor 13.7
portrait 13.6
business 13.4
businessman 13.2
smile 12.8
chair 12.6
team 12.5
interior 12.4
music 11.8
brass 11.6
lifestyle 11.6
couple 11.3
office 11.2
modern 11.2
casual 11
nurse 10.8
working 10.6
medical 10.6
wind instrument 10.5
looking 10.4
education 10.4
worker 9.9
lady 9.7
musical instrument 9.6
boy 9.6
table 9.5
instrument 9.5
meeting 9.4
senior 9.4
student 9.3
classroom 9.3
leisure 9.1
businesswoman 9.1
guitar 9.1
board 9
educator 9
human 9
fun 9
kitchen 8.9
handsome 8.9
family 8.9
case 8.9
group 8.9
sick person 8.9
to 8.8
check 8.7
rock 8.7
30s 8.7
exam 8.6
hospital 8.5
attractive 8.4
musician 8.4
health 8.3
player 8.3
equipment 8.3
inside 8.3
girls 8.2
domestic 8.1
bass 8.1
new 8.1
job 8
planner 7.9
clothing 7.9
together 7.9
husband 7.8
guy 7.8
black 7.8
performer 7.8
concert 7.8
play 7.8
musical 7.7
businesspeople 7.6
fashion 7.5
mature 7.4
holding 7.4
teamwork 7.4
child 7.4
occupation 7.3
alone 7.3
dress 7.2
activity 7.2
mother 7.1
love 7.1
executive 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.5
person 89.3
clothing 79.5
black and white 53.2
posing 46.8
old 43.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Female, 64.1%
Calm 87.4%
Surprised 3.9%
Sad 3.7%
Angry 1.3%
Confused 1.3%
Happy 1%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 18-24
Gender Male, 55.9%
Sad 52.6%
Calm 37.4%
Angry 4.7%
Confused 1.3%
Disgusted 1.2%
Happy 1.2%
Surprised 1%
Fear 0.8%

AWS Rekognition

Age 18-24
Gender Male, 99.9%
Sad 51.4%
Calm 31.6%
Angry 4.6%
Confused 3.4%
Surprised 3.2%
Fear 2.2%
Happy 2.2%
Disgusted 1.6%

AWS Rekognition

Age 35-43
Gender Male, 79.4%
Surprised 67.4%
Calm 16.6%
Angry 6.5%
Fear 3.7%
Happy 2.4%
Sad 1.7%
Confused 1%
Disgusted 0.6%

AWS Rekognition

Age 21-29
Gender Male, 64.6%
Surprised 99.6%
Calm 0.3%
Fear 0%
Confused 0%
Sad 0%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 42-50
Gender Female, 97.1%
Calm 76.5%
Happy 19.1%
Confused 1.5%
Surprised 1%
Disgusted 0.7%
Fear 0.6%
Sad 0.4%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

12081.
12081

Google

12081. 12081. 12081 YAGON-YT3RA2-AMTZA
12081.
YAGON-YT3RA2-AMTZA
12081