Human Generated Data

Title

Untitled (group of children gathered around board game)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7018

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.6
Person 99.6
Person 99.5
Person 97.4
Person 97
Person 96.9
Person 96.8
Person 94.9
Game 92.6
Person 88.1
Person 87.2
Person 84.4
Person 75.2
Gambling 70.4
People 62.5
Person 51.8

Imagga
created on 2021-12-15

senior 44
man 35.6
elderly 35.4
newspaper 33.6
people 33.5
male 33.3
person 29.9
retired 29.1
adult 27.7
product 26.3
old 25.8
mature 25.1
home 23.9
happy 23.2
retirement 22.1
creation 22
daily 21.9
computer 21.7
casual 20.3
laptop 20.2
portrait 20
couple 20
specialist 19.7
camera 19.4
indoors 19.3
looking 18.4
smiling 17.4
older 16.5
sitting 16.3
lifestyle 15.9
teacher 15.4
together 14.9
business 14.6
aged 14.5
husband 14.3
classroom 14
education 13.8
70s 13.8
pensioner 13.5
horizontal 13.4
happiness 13.3
businessman 13.2
day 12.5
room 12.5
nurse 12.5
age 12.4
office 12.2
face 12.1
men 12
hair 11.9
technology 11.9
gray hair 11.8
60s 11.7
gray 11.7
class 11.6
holding 11.5
professional 11.1
half length 10.7
hand 10.6
using 10.6
work 10.4
student 10.3
blackboard 10.2
glasses 10.2
smile 10
cheerful 9.7
look 9.6
table 9.6
aging 9.6
wife 9.5
enjoying 9.5
leisure 9.1
one 9
lady 8.9
group 8.9
to 8.8
browsing 8.8
teaching 8.8
clothing 8.8
active 8.7
desk 8.6
married 8.6
school 8.6
writing 8.6
money 8.5
meeting 8.5
finance 8.4
relaxed 8.4
modern 8.4
manager 8.4
color 8.3
alone 8.2
indoor 8.2
relaxing 8.2
women 7.9
love 7.9
facing camera 7.9
pension 7.9
grandmother 7.8
casual clothing 7.8
designer 7.7
architect 7.7
daytime 7.7
project 7.7
studying 7.7
serious 7.6
human 7.5
fun 7.5
emotion 7.4
grandma 7.4
joyful 7.3
grandfather 7.3
bright 7.1
family 7.1
working 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 99.6
text 99
statue 81.8
group 76.1
clothing 74.3
woman 58.8
human face 58.5
art 51.4

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 71%
Sad 50.1%
Calm 36.2%
Fear 7.3%
Happy 2.2%
Confused 1.8%
Angry 1.7%
Surprised 0.5%
Disgusted 0.2%

AWS Rekognition

Age 31-47
Gender Female, 67.5%
Calm 79.2%
Sad 13.8%
Happy 4.1%
Angry 1.5%
Confused 0.5%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Female, 75.7%
Calm 94.7%
Sad 4.2%
Angry 0.4%
Confused 0.2%
Surprised 0.2%
Happy 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 24-38
Gender Female, 87.2%
Calm 65.2%
Happy 10.4%
Sad 9.3%
Surprised 8%
Confused 3.5%
Angry 2.7%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 26-42
Gender Female, 58.4%
Sad 72.1%
Calm 26.1%
Confused 1.4%
Happy 0.1%
Fear 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 22-34
Gender Female, 78%
Calm 88.1%
Sad 7.3%
Angry 2.5%
Surprised 0.9%
Fear 0.7%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Female, 52.1%
Sad 35.9%
Calm 28.3%
Happy 17.9%
Fear 8.3%
Angry 4.7%
Confused 1.9%
Surprised 1.6%
Disgusted 1.4%

AWS Rekognition

Age 22-34
Gender Female, 53.9%
Calm 77.5%
Happy 20%
Sad 1.8%
Angry 0.4%
Surprised 0.1%
Confused 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 27-43
Gender Female, 75.1%
Sad 95%
Calm 4.3%
Happy 0.4%
Confused 0.2%
Angry 0%
Fear 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 4-14
Gender Female, 73.1%
Calm 69.1%
Sad 19.8%
Happy 7.1%
Angry 2.3%
Disgusted 0.5%
Confused 0.5%
Fear 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people sitting at a table 89.3%
a group of people sitting around a table 89.2%
a group of people sitting on a table 83.7%

Text analysis

Amazon

22405
50e
NAGOR

Google

2.2405
2.2405