Human Generated Data

Title

Untitled (large family portrait in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17927

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.5
Human 99.5
Person 97.8
Person 97.6
People 97
Person 96.9
Person 96.4
Person 96.3
Person 96
Mammal 95.7
Animal 95.7
Canine 95.7
Dog 95.7
Pet 95.7
Person 95.4
Person 95.3
Person 94.7
Person 94
Clothing 93.6
Apparel 93.6
Person 92.9
Person 92
Family 84.2
Baby 77.3
Indoors 72
Furniture 71.1
Footwear 70
Shoe 70
Overcoat 69.8
Suit 69.8
Coat 69.8
Room 69.6
Female 65.7
Kid 65.5
Child 65.5
Door 62.1
Photography 61.6
Photo 61.6
Dress 60.5
Living Room 59.2
Shorts 58.4
Chair 55.7

Imagga
created on 2022-03-04

person 38.5
male 31.9
man 31.6
teacher 30.9
people 29.5
classroom 24.9
adult 23.4
room 23.3
smiling 21
women 20.5
businessman 20.3
musical instrument 20.1
business 19.4
group 19.3
professional 18.5
brass 18.3
couple 18.3
office 17.8
table 17.5
wind instrument 17.2
sitting 17.2
happy 16.9
holding 16.5
school 16.5
work 16.5
education 16.4
student 16.4
men 15.5
planner 15.2
interior 15
communication 14.3
indoors 14.1
lifestyle 13.7
smile 13.5
team 13.4
chair 13.3
kin 13
executive 13
portrait 12.9
job 12.4
meeting 12.2
cheerful 12.2
senior 12.2
child 11.9
two 11.9
desk 11.4
educator 11.4
learning 11.3
home 11.2
indoor 10.9
suit 10.8
conference 10.7
drinking 10.5
modern 10.5
worker 10.4
boy 10.4
corporate 10.3
manager 10.2
happiness 10.2
businesswoman 10
board 9.9
together 9.6
standing 9.6
businesspeople 9.5
glass 9.3
study 9.3
mature 9.3
teamwork 9.3
outfit 9.2
wine 9.2
confident 9.1
kid 8.9
looking 8.8
students 8.8
colleagues 8.7
class 8.7
love 8.7
workplace 8.6
togetherness 8.5
casual 8.5
enjoyment 8.4
children 8.2
percussion instrument 8.1
handsome 8
family 8
celebration 8
to 8
blackboard 7.9
leisure activity 7.8
teaching 7.8
color 7.8
restaurant 7.8
life 7.7
attractive 7.7
marimba 7.7
cornet 7.6
talking 7.6
wife 7.6
friends 7.5
occupation 7.3
employee 7.3
waiter 7.3

Google
created on 2022-03-04

Style 83.8
Black-and-white 83.4
Curtain 79.4
Suit 76.7
Chair 75.7
Font 73.5
Monochrome 72
Art 71.5
Monochrome photography 70.9
Event 70.8
T-shirt 67.1
Room 66.9
Vintage clothing 66.6
Shorts 66.3
Team 66.2
Musician 64.6
Music 64.4
Guitar 63.9
Stock photography 63
Illustration 62.3

Microsoft
created on 2022-03-04

tennis 99.6
person 98.3
text 96.4
clothing 92.1
player 69
woman 56.7
man 56.6
posing 48.9
male 21.6

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.4%
Sad 67.9%
Confused 23.3%
Calm 4.9%
Disgusted 1.4%
Surprised 0.8%
Fear 0.7%
Happy 0.6%
Angry 0.5%

AWS Rekognition

Age 30-40
Gender Female, 57.6%
Happy 72.5%
Calm 20.2%
Sad 3.1%
Surprised 2.7%
Confused 0.5%
Angry 0.3%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 100%
Sad 28.4%
Calm 23%
Happy 16.4%
Confused 13.8%
Surprised 7.9%
Disgusted 4.8%
Angry 4.4%
Fear 1.3%

AWS Rekognition

Age 23-33
Gender Male, 96.1%
Calm 65.1%
Happy 12.9%
Sad 12.8%
Angry 3.3%
Confused 2.5%
Disgusted 1.7%
Surprised 1%
Fear 0.8%

AWS Rekognition

Age 37-45
Gender Male, 84.3%
Sad 73.5%
Calm 17%
Confused 3.1%
Happy 2.5%
Surprised 1.4%
Angry 1.2%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 42-50
Gender Male, 75.6%
Sad 45.8%
Calm 45.3%
Surprised 5.4%
Angry 0.9%
Confused 0.8%
Fear 0.6%
Disgusted 0.6%
Happy 0.5%

AWS Rekognition

Age 40-48
Gender Female, 99.6%
Calm 82.9%
Happy 12.9%
Surprised 2.2%
Sad 0.8%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 97.8%
Sad 82.7%
Confused 11.1%
Calm 2.4%
Happy 1.5%
Surprised 0.7%
Angry 0.7%
Disgusted 0.7%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Sad 70.5%
Calm 20.4%
Confused 5.4%
Happy 1.8%
Angry 0.8%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 70.5%
Surprised 34.6%
Happy 23.8%
Fear 16.1%
Calm 7.6%
Angry 6.3%
Sad 4.6%
Confused 3.6%
Disgusted 3.3%

AWS Rekognition

Age 28-38
Gender Female, 67.5%
Calm 99.2%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Female, 67.5%
Calm 87.1%
Sad 5.1%
Happy 4.4%
Confused 1.8%
Disgusted 0.6%
Angry 0.4%
Surprised 0.3%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Female, 55.9%
Happy 84.5%
Calm 9.4%
Surprised 2.3%
Sad 1.4%
Angry 1.1%
Fear 0.6%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 26-36
Gender Female, 93.5%
Sad 58.8%
Calm 33%
Happy 3.9%
Confused 1.4%
Disgusted 1.1%
Angry 1%
Surprised 0.5%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Dog 95.7%
Shoe 70%

Captions

Microsoft

a group of people posing for a photo 91.1%
a group of people posing for a picture 91%
a group of people standing on a court with a racket 78.1%

Text analysis

Amazon

2

Google

MJI7-
-XA
GON
MJI7- -YT3R2- -XA GON
-YT3R2-