Human Generated Data

Title

Untitled (parents and four children on bed)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17064

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.4%
Human 98.4%
Person 98.2%
Person 97.9%
Person 97%
Person 96.6%
Person 96.2%
Interior Design 92.4%
Indoors 92.4%
Room 85.1%
Bed 76%
Furniture 76%
People 73.6%
Crowd 71.3%
Meal 70.2%
Food 70.2%
Suit 65.8%
Coat 65.8%
Overcoat 65.8%
Clothing 65.8%
Apparel 65.8%
Plant 65.7%
Face 63.5%
Photography 61.5%
Photo 61.5%
Clinic 59%
Chair 59%
Classroom 56.5%
School 56.5%
Audience 55.8%
Tablecloth 55.6%
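
The scores above are confidence values. A minimal Python sketch (not the museum's or Amazon's actual pipeline; the 70% threshold is an arbitrary assumption) of thresholding and ranking such label/confidence pairs, using a subset of the Amazon tags listed in this record:

```python
# Sample label/confidence pairs taken from the Amazon tags in this record.
labels = {
    "Person": 98.4, "Interior Design": 92.4, "Indoors": 92.4,
    "Room": 85.1, "Bed": 76.0, "Furniture": 76.0,
    "Crowd": 71.3, "Suit": 65.8, "Tablecloth": 55.6,
}

def confident_labels(scores, threshold=70.0):
    """Return labels at or above the confidence threshold, highest first."""
    return sorted(
        (name for name, conf in scores.items() if conf >= threshold),
        key=lambda name: -scores[name],
    )

print(confident_labels(labels))
# → ['Person', 'Interior Design', 'Indoors', 'Room', 'Bed', 'Furniture', 'Crowd']
```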

Clarifai
created on 2023-10-28

people 99.9%
group 99%
adult 97.6%
woman 97.4%
man 96.6%
child 94.8%
group together 94.2%
furniture 91.8%
monochrome 90.3%
administration 89.8%
room 89.3%
leader 89.3%
chair 89%
many 88.5%
several 82%
indoors 80.2%
boy 79.2%
music 79.1%
sit 78.4%
wedding 77.4%

Imagga
created on 2022-02-26

people 26.2%
man 24.2%
male 24.1%
person 21.9%
silhouette 17.4%
classroom 16.1%
old 14.6%
adult 14.2%
business 13.4%
blackboard 13.1%
women 12.6%
businessman 12.4%
education 12.1%
group 12.1%
men 12%
student 11.9%
happy 11.9%
room 11.8%
vintage 11.6%
black 11.4%
art 11.3%
design 11.2%
portrait 11%
teacher 10.9%
office 10.7%
child 10.4%
drawing 10.4%
school 10.1%
symbol 10.1%
hand 9.9%
modern 9.8%
human 9.7%
computer 9.6%
love 9.5%
senior 9.4%
grunge 9.4%
two 9.3%
event 9.2%
retro 9%
team 9%
boy 8.7%
smiling 8.7%
smile 8.5%
structure 8.4%
negative 8.4%
aged 8.1%
film 8.1%
professional 7.9%
barbershop 7.9%
work 7.8%
couple 7.8%
space 7.8%
sitting 7.7%
crowd 7.7%
fun 7.5%
positive 7.4%
light 7.3%
newspaper 7.3%
indoor 7.3%
graphic 7.3%
paint 7.2%
looking 7.2%
creation 7.2%
icon 7.1%
happiness 7%

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

indoor 85.6%
person 83.4%
text 81.1%
wedding 77.4%
clothing 71.8%
human face 63.4%

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 98%
Calm 39.2%
Sad 33.9%
Happy 12.3%
Confused 10.8%
Disgusted 1.3%
Angry 1.1%
Fear 0.9%
Surprised 0.6%

AWS Rekognition

Age 22-30
Gender Female, 91.9%
Happy 83.6%
Calm 10.1%
Surprised 3.2%
Confused 0.9%
Sad 0.7%
Disgusted 0.7%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 25-35
Gender Male, 66.6%
Sad 58%
Happy 33.8%
Calm 3.8%
Surprised 2.2%
Confused 0.7%
Angry 0.6%
Fear 0.6%
Disgusted 0.4%

AWS Rekognition

Age 31-41
Gender Female, 65.9%
Happy 97.5%
Calm 1.1%
Surprised 0.4%
Sad 0.3%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 95.9%
Happy 74.3%
Surprised 22.5%
Sad 0.9%
Angry 0.7%
Calm 0.5%
Fear 0.4%
Disgusted 0.3%
Confused 0.3%

AWS Rekognition

Age 45-53
Gender Male, 97.6%
Happy 87.5%
Surprised 6%
Angry 2%
Calm 1.6%
Confused 1.3%
Fear 0.6%
Disgusted 0.5%
Sad 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
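
The per-face emotion percentages above can be reduced to a single dominant emotion. A minimal Python sketch (hypothetical, not Rekognition's actual API response format), using the first two AWS Rekognition faces listed in this record:

```python
# Per-face emotion confidences taken from the first two AWS Rekognition
# results in this record (remaining emotions omitted for brevity).
faces = [
    {"Calm": 39.2, "Sad": 33.9, "Happy": 12.3, "Confused": 10.8},
    {"Happy": 83.6, "Calm": 10.1, "Surprised": 3.2, "Confused": 0.9},
]

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

for face in faces:
    print(dominant_emotion(face))
# → ('Calm', 39.2)
# → ('Happy', 83.6)
```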

Feature analysis

Amazon

Person
Bed
Person 98.4%
Person 98.2%
Person 97.9%
Person 97%
Person 96.6%
Person 96.2%
Bed 76%

Text analysis

Amazon

13

Google

13
13