Human Generated Data

Title

Untitled (group portrait of ten member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10921

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.4
Person 99.4
Apparel 99.3
Clothing 99.3
Person 99
Person 98.1
Person 98
Person 97.7
Person 96
Furniture 91.4
Person 90.7
People 88.9
Person 83
Person 80.5
Fashion 78.6
Gown 78.6
Robe 78.1
Wedding 71.6
Female 66.8
Chair 65
Wedding Gown 64.7
Indoors 61
Person 59.5
Coat 58.7
Suit 58.7
Overcoat 58.7
Room 58
Dress 56
Family 55.3

Imagga
created on 2022-02-05

person 44.6
man 37.7
adult 36.2
male 32.6
people 32.4
teacher 30.3
nurse 29.6
professional 28.1
room 24.7
patient 24.3
business 24.3
businessman 23.8
home 22.3
happy 21.9
meeting 21.7
indoors 21.1
20s 21.1
businesswoman 20.9
educator 20.9
smiling 20.3
businesspeople 19.9
colleagues 18.5
women 18.2
30s 17.3
happiness 17.2
office 16.9
couple 16.6
student 16.5
talking 16.2
desk 16.1
hospital 15.5
men 15.5
group 15.3
working 15
senior 15
table 14.7
team 14.3
together 14
clothing 14
lab coat 13.8
sitting 13.8
indoor 13.7
40s 13.6
interior 13.3
medical 13.2
cheerful 13
life 12.9
corporate 12.9
smile 12.8
casual 12.7
kin 12.7
bathrobe 12.4
job 12.4
lifestyle 12.3
work 11.8
worker 11.6
coat 11.3
doctor 11.3
computer 11.2
case 11.2
mature 11.2
color 11.1
teamwork 11.1
portrait 11
family 10.7
mid adult 10.6
enrollee 10.5
sick person 10.5
adults 10.4
manager 10.2
two 10.2
communication 10.1
garment 10
boardroom 9.9
associates 9.8
discussing 9.8
coworkers 9.8
attractive 9.8
executive 9.8
health 9.7
four 9.6
face 9.2
laptop 9.1
old 9.1
25-30 years 8.8
discussion 8.8
thirties 8.8
two people 8.8
elderly 8.6
illness 8.6
grandfather 8.4
inside 8.3
occupation 8.3
suit 8.1
to 8
business people 7.9
30-35 years 7.9
forties 7.9
standing 7.8
20-25 years 7.8
cooperation 7.7
modern 7.7
diversity 7.7
twenties 7.7
ethnic 7.6
classroom 7.6
robe 7.5
care 7.4
camera 7.4
new 7.3
mother 7.1
day 7.1
waiter 7

Google
created on 2022-02-05

Style 83.8
Black-and-white 83.5
Chair 76.5
Curtain 74
Event 73.9
Classic 73.9
Monochrome 73.6
Monochrome photography 72.8
Vintage clothing 71.9
Suit 70.9
Team 68.5
Picture frame 67.1
Room 66.4
Sitting 64
History 55.8
Photo caption 55.1
Art 52.1
Retro style 51.8
Crew 51.7
Family 51.6

Microsoft
created on 2022-02-05

person 97.4
wedding dress 96.7
bride 93.1
woman 91.1
clothing 89.2
wedding 83
man 74.5
dress 70.1
text 67.4
wedding cake 59.8
table 53

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 84.5%
Confused 5.3%
Disgusted 2.9%
Happy 2%
Fear 1.9%
Sad 1.6%
Angry 1.1%
Surprised 0.7%

AWS Rekognition

Age 53-61
Gender Male, 95.6%
Calm 91.9%
Sad 6.7%
Angry 0.3%
Happy 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 23-33
Gender Male, 85.8%
Calm 99.7%
Sad 0.2%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 25-35
Gender Male, 97.7%
Calm 99.8%
Sad 0.1%
Disgusted 0%
Surprised 0%
Happy 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 96.9%
Calm 93.5%
Surprised 2.5%
Sad 2%
Fear 0.5%
Happy 0.5%
Disgusted 0.4%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 22-30
Gender Male, 98.9%
Calm 51.9%
Happy 15.1%
Confused 12.6%
Fear 9.5%
Sad 6%
Disgusted 2.4%
Surprised 1.4%
Angry 1%

AWS Rekognition

Age 43-51
Gender Male, 99.3%
Calm 81.9%
Angry 6.6%
Confused 5.1%
Sad 3.4%
Surprised 1.6%
Disgusted 0.7%
Fear 0.5%
Happy 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.4%
Calm 90.9%
Sad 6.8%
Confused 1%
Fear 0.4%
Surprised 0.4%
Disgusted 0.2%
Happy 0.2%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 87.1%
Calm 64.7%
Sad 27.2%
Happy 2.3%
Disgusted 1.9%
Confused 1.6%
Angry 1%
Fear 0.7%
Surprised 0.7%

AWS Rekognition

Age 30-40
Gender Female, 92.1%
Calm 87.7%
Sad 8.8%
Disgusted 1.2%
Surprised 1%
Fear 0.5%
Confused 0.3%
Angry 0.3%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people in a room 92.2%
a group of people sitting on a bed 59.2%
a group of people on a bed 59.1%

Text analysis

Amazon

A70A
MJI7 A70A
MJI7

Google

VOLV
VOLV 2VEEIA EITN
2VEEIA
EITN