Human Generated Data

Title

Untitled (group portrait of two men and two women)

Date

1970s

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.528

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 100
Couch 99.9
Person 99.5
Human 99.5
Person 99.3
Person 98.7
Person 96.9
Mammal 96.9
Dog 96.9
Animal 96.9
Canine 96.9
Pet 96.9
Armchair 86
Home Decor 74.5
Overcoat 71.2
Suit 71.2
Coat 71.2
Apparel 71.2
Clothing 71.2
Living Room 69.9
Indoors 69.9
Room 69.9
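
Each machine-generated tag above pairs a label with a confidence percentage. As a minimal sketch (a hypothetical helper, not part of this record), one might filter such (label, confidence) pairs by a threshold, illustrated with a few of the Amazon tags listed above:

```python
def filter_tags(tags, threshold=90.0):
    """Return (label, confidence) pairs at or above the threshold,
    sorted from most to least confident."""
    kept = [(label, conf) for label, conf in tags if conf >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

# A few of the Amazon tags from this record (label, confidence %).
amazon_tags = [
    ("Furniture", 100.0),
    ("Couch", 99.9),
    ("Person", 99.5),
    ("Armchair", 86.0),
    ("Suit", 71.2),
]

print(filter_tags(amazon_tags))
# keeps only Furniture, Couch, and Person at the 90% threshold
```

At a 90% threshold, low-confidence guesses such as "Suit" (71.2) drop out while the high-confidence furniture and person tags survive.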

Imagga
created on 2022-02-26

room 27.5
groom 24.1
adult 22.7
interior 22.1
people 20.6
person 20.3
man 19.5
male 16.4
luxury 16.3
happy 16.3
home 15.1
house 15
pretty 14.7
furniture 14.1
couple 13.1
sitting 12.9
rug 12.7
family 12.4
floor 12.1
two 11.9
relaxation 11.7
portrait 11.6
wall 11.4
bedroom 11.2
life 11.1
relax 10.9
smiling 10.8
lifestyle 10.8
wood 10.8
wicker 10.8
holiday 10.7
smile 10.7
men 10.3
women 10.3
work 10.1
indoor 10
face 9.9
fashion 9.8
lady 9.7
living 9.5
love 9.5
happiness 9.4
architecture 9.4
elegance 9.2
old 9.1
human 9
new 8.9
indoors 8.8
sofa 8.8
standing 8.7
product 8.6
day 8.6
estate 8.5
culture 8.5
modern 8.4
chair 8.3
inside 8.3
window 8.2
outdoors 8.2
together 7.9
sea 7.8
covering 7.7
outdoor 7.6
real 7.6
traditional 7.5
style 7.4
design 7.3
prayer rug 7.3
child 7.2
copy space 7.2
fireplace 7.2
looking 7.2
cute 7.2
decor 7.1
table 7
wooden 7

Google
created on 2022-02-26

Rectangle 88.2
Picture frame 88.2
Dress 87.1
Couch 86.7
Window 86.1
Wood 85
Art 83.4
Comfort 81.2
Tints and shades 77.3
Vintage clothing 75.9
Event 72.2
Painting 70.8
Sitting 69.4
Formal wear 68.7
Suit 68.4
Room 68.1
Visual arts 65.9
Chair 65.2
Interior design 59.4
Illustration 55

Microsoft
created on 2022-02-26

wall 99
indoor 98.4
room 95
gallery 93.9
dog 91.8
clothing 90.8
carnivore 87.5
person 86.1
painting 85.6
scene 80.6
woman 64.7
picture frame 6.9

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 99.6%
Happy 98.6%
Calm 0.5%
Confused 0.2%
Disgusted 0.2%
Sad 0.2%
Angry 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 34-42
Gender Male, 100%
Happy 85.8%
Calm 9.7%
Angry 1%
Confused 0.9%
Sad 0.9%
Surprised 0.8%
Fear 0.5%
Disgusted 0.5%

AWS Rekognition

Age 28-38
Gender Female, 100%
Happy 99%
Surprised 0.3%
Calm 0.2%
Angry 0.2%
Fear 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 93.8%
Sad 2%
Fear 1.5%
Angry 1.3%
Confused 0.5%
Disgusted 0.3%
Surprised 0.3%
Happy 0.2%
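
For each detected face, AWS Rekognition reports one confidence score per emotion, as in the four blocks above. A minimal sketch (hypothetical helper, not part of this record) of picking the dominant emotion from such scores, using the values reported for the first face:

```python
def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

# Emotion confidences (%) for the first face in this record.
face_1 = {
    "Happy": 98.6, "Calm": 0.5, "Confused": 0.2, "Disgusted": 0.2,
    "Sad": 0.2, "Angry": 0.2, "Fear": 0.1, "Surprised": 0.1,
}

print(dominant_emotion(face_1))  # ('Happy', 98.6)
```

Applied to the four faces above, this yields Happy, Happy, Happy, and Calm respectively, matching the top-listed emotion in each block.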

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Dog 96.9%
Suit 71.2%

Captions

Microsoft

a person sitting in a room 75.4%
a person standing in a room 75.3%
a person sitting in front of a mirror 51.7%