Human Generated Data

Title

Untitled (group portrait of ten member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10922

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-02-05

Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Chair 99.5
Furniture 99.5
Person 98.6
Person 98
Person 98
Person 97.1
Person 97
Person 94.9
Person 93.1
Face 89.7
Dress 88.6
Gown 88.5
Fashion 88.5
Robe 87
Female 83.7
Indoors 82.3
Couch 81.4
Wedding 80.8
People 77.6
Table 75.5
Suit 74.8
Coat 74.8
Overcoat 74.8
Home Decor 74.2
Living Room 72.9
Room 72.9
Wedding Gown 72.4
Woman 70
Portrait 68.7
Photography 68.7
Photo 68.7
Baby 68.6
Person 68.3
Person 65.1
Kid 64.4
Child 64.4
Meal 64
Food 64
Bed 63.2
Plant 60.7
Bride 58.6
Smile 56.9
Girl 56.4
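
The labels above match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of how comparable tags could be regenerated with boto3 follows; the file name and the MinConfidence threshold are illustrative assumptions, not values from this record.

    # Sketch: reproduce Amazon-style tags with AWS Rekognition (boto3).
    # "photo.jpg" and MinConfidence=55 are placeholders, not from this record.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

    for label in response["Labels"]:
        # Each label pairs a name with a percent confidence,
        # matching the "Clothing 99.9" style of the list above.
        print(label["Name"], round(label["Confidence"], 1))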

Clarifai
created on 2023-10-29

people 99.8
group 98.9
man 98.4
group together 98
child 97.4
adult 97.3
woman 96.8
education 95.3
leader 93.6
many 89.9
indoors 86.5
boy 84.1
sit 83.7
school 82.7
musician 81.1
league 80.9
chair 78.8
uniform 78.1
room 77.7
meeting 77.6
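
Concepts like these typically come from Clarifai's general image recognition model. A hedged sketch using the clarifai-grpc client is below; the model ID, user/app IDs, and personal access token follow Clarifai's public documentation pattern and are assumptions, not details from this record.

    # Sketch: Clarifai-style concepts via the clarifai-grpc client.
    # Model ID, user/app IDs, and the PAT are placeholders.
    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_PAT"),)

    with open("photo.jpg", "rb") as f:
        request = service_pb2.PostModelOutputsRequest(
            user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
            model_id="general-image-recognition",
            inputs=[resources_pb2.Input(data=resources_pb2.Data(
                image=resources_pb2.Image(base64=f.read())))],
        )
    response = stub.PostModelOutputs(request, metadata=metadata)

    for concept in response.outputs[0].data.concepts:
        # Concept values are 0-1; scale to percent to match the list above.
        print(concept.name, round(concept.value * 100, 1))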

Imagga
created on 2022-02-05

person 42.2
nurse 38
people 35.1
man 31.6
male 30.5
room 28.9
adult 28.6
patient 28.5
home 26.3
indoors 24.6
women 21.4
smiling 21
couple 20.9
teacher 20
happy 19.4
table 18.2
men 18
sitting 17.2
meeting 17
professional 16.8
business 16.4
hospital 16.2
senior 15.9
businessman 15.9
together 15.8
talking 15.2
happiness 14.9
case 14.6
office 14.6
businesswoman 14.5
group 14.5
smile 14.3
interior 14.2
sick person 13.9
indoor 13.7
family 13.3
businesspeople 13.3
desk 13.2
20s 12.8
worker 12.5
grandfather 12.4
portrait 12.3
cheerful 12.2
work 11.8
colleagues 11.7
team 11.6
30s 11.5
mature 11.2
casual 11
medical 10.6
corporate 10.3
mother 10.2
teamwork 10.2
educator 10.1
life 9.8
health 9.7
elderly 9.6
lifestyle 9.4
two 9.3
inside 9.2
chair 9.2
clothing 9.1
modern 9.1
lady 8.9
working 8.8
child 8.8
two people 8.8
mid adult 8.7
student 8.6
doctor 8.5
old 8.4
house 8.4
laptop 8.2
board 8.1
new 8.1
suit 8.1
kid 8
job 8
executive 8
lab coat 7.8
casual clothing 7.8
classroom 7.8
40s 7.8
couch 7.7
attractive 7.7
husband 7.6
enjoying 7.6
coat 7.5
holding 7.4
occupation 7.3
color 7.2
dress 7.2
looking 7.2
waiter 7.1
love 7.1
to 7.1
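
Imagga exposes comparable tagging through a REST endpoint. A sketch with the requests library follows; the API key, secret, and file name are placeholders.

    # Sketch: Imagga-style tags via the v2 tagging endpoint.
    # API key/secret and the file name are placeholders.
    import requests

    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
        files={"image": open("photo.jpg", "rb")},
    )
    for entry in response.json()["result"]["tags"]:
        # Each entry pairs an English tag name with a percent confidence.
        print(entry["tag"]["en"], round(entry["confidence"], 1))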

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

wedding dress 90.9
person 89.2
text 86.8
bride 81.4
woman 77.4
clothing 75.4
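
Tags in this style are produced by Azure Computer Vision's image analysis service. A hedged sketch against its v3.2 Analyze endpoint follows; the resource host and subscription key are placeholders.

    # Sketch: Microsoft-style tags via Azure Computer Vision v3.2 Analyze.
    # The endpoint host and subscription key are placeholders.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=open("photo.jpg", "rb").read(),
    )
    for tag in response.json()["tags"]:
        # Azure reports confidence as 0-1; scale to percent as listed above.
        print(tag["name"], round(tag["confidence"] * 100, 1))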

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 93.7%
Calm 72.3%
Sad 14.1%
Happy 8.2%
Angry 1.3%
Fear 1.2%
Surprised 1.1%
Confused 1%
Disgusted 0.9%

AWS Rekognition

Age 31-41
Gender Male, 75.4%
Calm 98.9%
Sad 0.4%
Happy 0.3%
Fear 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Sad 73.4%
Calm 10.4%
Surprised 6%
Fear 3.3%
Confused 2.7%
Angry 1.8%
Disgusted 1.5%
Happy 0.9%

AWS Rekognition

Age 26-36
Gender Male, 97.2%
Calm 58.4%
Sad 26.3%
Angry 9.1%
Confused 2.2%
Disgusted 1.4%
Fear 1.2%
Happy 0.8%
Surprised 0.5%

AWS Rekognition

Age 29-39
Gender Male, 90.2%
Calm 82.8%
Surprised 8.8%
Happy 2.2%
Fear 1.9%
Sad 1.7%
Angry 1%
Disgusted 0.8%
Confused 0.7%

AWS Rekognition

Age 29-39
Gender Male, 56.3%
Calm 96.4%
Happy 2.7%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%
Sad 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 52-60
Gender Female, 81.9%
Calm 99.5%
Sad 0.3%
Happy 0.1%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 31-41
Gender Male, 75.8%
Calm 93.6%
Sad 4.5%
Happy 0.9%
Confused 0.4%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 31-41
Gender Female, 98.3%
Surprised 43.6%
Calm 38.6%
Fear 12.4%
Happy 2.1%
Disgusted 1.4%
Sad 0.8%
Angry 0.6%
Confused 0.6%

AWS Rekognition

Age 27-37
Gender Male, 98.2%
Calm 97.4%
Sad 2%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%
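
The age ranges, gender guesses, and emotion distributions above follow the response shape of Rekognition's DetectFaces API with all attributes requested. A minimal boto3 sketch follows; the file name is a placeholder.

    # Sketch: per-face age/gender/emotion estimates via Rekognition DetectFaces.
    # "photo.jpg" stands in for the scanned print.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 50, "High": 58}
        gender = face["Gender"]     # value plus confidence
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to match the listings above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")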

Google Vision (identical result reported for each of the 10 detected faces)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
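
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above is a phrase rather than a number. A sketch with the google-cloud-vision client follows; the file name is a placeholder.

    # Sketch: Google Vision face likelihood buckets.
    # "photo.jpg" stands in for the scanned print.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum; .name gives e.g. VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)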

Feature analysis

Amazon

Person 99.6%
Person 98.6%
Person 98%
Person 98%
Person 97.1%
Person 97%
Person 94.9%
Person 93.1%
Person 68.3%
Person 65.1%
Couch 81.4%

Text analysis

Amazon

A3DA
MJ131 A3DA
MJ131
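
Strings like these are what Rekognition's DetectText API reads from the print's edge markings; LINE detections group words and WORD detections repeat them individually, which is why both "MJ131 A3DA" and "MJ131" appear above. A sketch follows; the file name is a placeholder.

    # Sketch: edge-marking text via Rekognition DetectText.
    # "photo.jpg" stands in for the scanned print.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Type is "LINE" or "WORD"; both kinds appear in the listing above.
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))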

Google

VCEV 2VEEIA EIA
VCEV
2VEEIA
EIA
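
Google's counterpart is the text_detection (OCR) method, whose first annotation is the full detected string and whose later annotations are the individual tokens, matching the "VCEV 2VEEIA EIA" line followed by its parts above. A sketch follows; the file name is a placeholder.

    # Sketch: Google Vision OCR over the same scan.
    # "photo.jpg" stands in for the scanned print.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # First annotation: the full text block; the rest: individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)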