Human Generated Data

Title

Untitled (Fanling Babies Home, Hong Kong)

Date

March 3–13, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2313

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

The number after each tag is the service's confidence score on a 0–100 scale.

Amazon
created on 2023-10-06

Baby 98.4
Person 98.4
Person 98.4
Child 98.4
Female 98.4
Girl 98.4
Baby 97.4
Person 97.4
Baby 96.3
Person 96.3
Baby 96
Person 96
Baby 95.9
Person 95.9
Baby 95.6
Person 95.6
Face 95.4
Head 95.4
Person 94.8
Architecture 93.9
Building 93.9
School 93.9
Baby 92.5
Person 92.5
Person 92.2
Person 91.3
Classroom 86.5
Indoors 86.5
Room 86.5
Person 86.3
Child 86.3
Boy 86.3
Male 86.3
Person 84.5
Person 68.1
Furniture 57.3
Student 57
Table 56.4
Kindergarten 56.4
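The tag lists above pair each label with a confidence score, and downstream consumers typically keep only labels above a threshold. A minimal sketch (not the museum's actual pipeline; sample values copied from the Amazon list above):

```python
# Illustrative only: filter confidence-scored labels like those above.
# Sample values are taken from the Amazon Rekognition list in this record.
labels = [
    {"Name": "Baby", "Confidence": 98.4},
    {"Name": "Person", "Confidence": 98.4},
    {"Name": "Kindergarten", "Confidence": 56.4},
    {"Name": "Table", "Confidence": 56.4},
]

def filter_labels(labels, min_confidence=90.0):
    """Keep labels at or above the threshold, highest confidence first."""
    kept = [l for l in labels if l["Confidence"] >= min_confidence]
    return sorted(kept, key=lambda l: l["Confidence"], reverse=True)

print([l["Name"] for l in filter_labels(labels)])  # ['Baby', 'Person']
```

Low-confidence tags such as "Kindergarten 56.4" survive only if the threshold is lowered, which is why the tail of each list is noisier than the head.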

Clarifai
created on 2018-05-10

child 99.9
education 99.5
people 99.3
school 98.7
group 98.5
elementary school 97.7
classroom 97.7
boy 94.8
adult 94.1
woman 91.9
many 91.7
room 91
furniture 90.9
sit 90.5
teacher 90.2
indoors 88.2
son 87.7
wear 86.9
offspring 84.6
facial expression 84.6

Imagga
created on 2023-10-06

nurse 35.3
child 34.5
man 33.6
people 33.5
classroom 32.8
male 31.6
home 30.3
indoors 29
couple 28.8
hospital 27.5
happy 27
together 25.4
smiling 23.2
room 22.9
happiness 22.7
family 22.3
person 22.2
love 22.1
adult 21
mother 20.9
sitting 19.8
lifestyle 19.5
boy 19.1
husband 17.5
group 16.9
adults 16.1
father 15.7
children 15.5
relaxing 15.5
casual 15.3
computer 15.3
women 15
daughter 14.9
education 14.7
laptop 14.7
student 14.5
parent 13.7
two 13.6
boyfriend 13.5
girlfriend 13.5
30s 13.5
lying 13.2
relationship 13.1
sibling 12.8
spa 12.6
day 12.6
friends 12.2
cheerful 12.2
friendship 12.2
smile 12.1
20s 11.9
teacher 11.7
color 11.7
couch 11.6
class 11.6
holding 11.6
wife 11.4
togetherness 11.3
son 11.3
enjoyment 11.3
fun 11.2
school 11
leisure 10.8
brown hair 10.8
childhood 10.8
students 10.7
half length 10.7
two people 10.7
kid 10.6
massage 10.6
attractive 10.5
bed 10.4
looking 10.4
portrait 10.4
men 10.3
baby 10.1
life 9.8
casual clothing 9.8
thirties 9.7
brunette 9.6
living 9.5
relaxation 9.2
treatment 9.2
back 9.2
indoor 9.1
girls 9.1
dad 9
technology 8.9
sofa 8.8
daytime 8.7
work 8.6
friend 8.6
females 8.5
bedroom 8.5
communication 8.4
horizontal 8.4
book 8.2
care 8.2
playing 8.2
team 8.1
romantic 8
little 8
fond 7.9
20 24 years 7.9
clinic 7.8
affectionate 7.8
hug 7.8
patient 7.7
comfort 7.7
mid adult 7.7
loving 7.6
reading 7.6
talking 7.6
chair 7.6
desk 7.6
kids 7.5
house 7.5
coffee 7.4
inside 7.4
professional 7.3
occupation 7.3
cute 7.2
clothing 7.2
romance 7.1
medical 7.1
pretty 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 100
sitting 97.1
indoor 96.8
child 94.9
group 60
family 19.9
crowd 1.1

Face analysis

AWS Rekognition

Age 0-4
Gender Female, 67%
Sad 95.9%
Fear 61.8%
Surprised 6.3%
Calm 1.2%
Angry 0.7%
Happy 0.4%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 0-3
Gender Female, 99.6%
Calm 82.7%
Sad 8.2%
Fear 6.7%
Surprised 6.3%
Confused 4.8%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 0-4
Gender Male, 99.2%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 0-4
Gender Male, 88.5%
Calm 95%
Surprised 6.6%
Fear 6.1%
Sad 2.7%
Happy 0.9%
Confused 0.7%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 1-7
Gender Male, 79.5%
Calm 85.5%
Fear 6.7%
Surprised 6.4%
Sad 6.2%
Angry 1.9%
Happy 1.2%
Confused 1%
Disgusted 0.2%

AWS Rekognition

Age 0-4
Gender Male, 64.5%
Fear 51.7%
Surprised 45.3%
Happy 8.4%
Sad 7.7%
Calm 6.5%
Disgusted 1.9%
Angry 1.1%
Confused 1.1%

AWS Rekognition

Age 0-6
Gender Female, 99%
Sad 79.2%
Fear 43.4%
Calm 22%
Surprised 6.4%
Happy 3.2%
Confused 0.5%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 0-3
Gender Female, 95%
Angry 49.9%
Calm 19.4%
Fear 12.2%
Surprised 12%
Sad 4.6%
Happy 2.3%
Confused 1.4%
Disgusted 1.2%

AWS Rekognition

Age 0-4
Gender Male, 76.1%
Fear 79.2%
Sad 46%
Surprised 6.8%
Calm 5.9%
Angry 3%
Happy 1.5%
Disgusted 1%
Confused 0.9%

AWS Rekognition

Age 4-10
Gender Female, 66.4%
Calm 72.8%
Sad 47.9%
Surprised 6.3%
Fear 6%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 10-18
Gender Male, 91.7%
Calm 37.4%
Fear 25.3%
Sad 16.7%
Surprised 7.5%
Confused 7.4%
Disgusted 5.7%
Happy 4.2%
Angry 2.3%
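
Each AWS Rekognition block above reports an estimated age range, a gender guess with its confidence, and a confidence score for every emotion; the emotion listed first is simply the highest-scoring one. A hedged sketch of that selection, using values mirroring the first face in this record (not the service's own code):

```python
# Illustrative only: pick the dominant emotion from a face-analysis result
# shaped like the AWS Rekognition blocks above (first face in this record).
face = {
    "AgeRange": {"Low": 0, "High": 4},
    "Gender": {"Value": "Female", "Confidence": 67.0},
    "Emotions": [
        {"Type": "SAD", "Confidence": 95.9},
        {"Type": "FEAR", "Confidence": 61.8},
        {"Type": "SURPRISED", "Confidence": 6.3},
        {"Type": "CALM", "Confidence": 1.2},
    ],
}

def dominant_emotion(face):
    """Return the emotion type with the highest confidence score."""
    return max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]

print(dominant_emotion(face))  # SAD
```

Note that the emotion scores are scored independently and need not sum to 100, so a face can score high on both "Sad" and "Fear", as several entries above do.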

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Baby 98.4%
Person 98.4%
Child 98.4%
Female 98.4%
Girl 98.4%
Boy 86.3%
Male 86.3%

Categories

Imagga

interior objects 88.5%
people portraits 10.8%