Human Generated Data

Title

Untitled (family portrait, outside house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16451

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-02-11

Human 99.5
Person 99.5
Person 99.4
Person 99.2
Person 97.3
People 95.3
Person 95
Chair 90.3
Furniture 90.3
Face 83.6
Family 82.2
Accessories 81.2
Accessory 81.2
Tie 81.2
Apparel 79.5
Clothing 79.5
Suit 79
Coat 79
Overcoat 79
Dress 73.7
Female 69.1
Photography 68.9
Portrait 68.9
Photo 68.9
Plant 64.5
Grass 64.5
Home Decor 59.8
Outdoors 59.5
Floor 57.5
Smile 57.4
Musical Instrument 55.7
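The flat "label confidence" lines above correspond to the structure of Amazon Rekognition's `detect_labels` response. A minimal sketch of that flattening, using a hypothetical stub dict shaped like the real API output (values taken from the list above; a live call would need AWS credentials):

```python
# Hypothetical stub shaped like an Amazon Rekognition detect_labels response.
# A live call would look like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Chair", "Confidence": 90.3},
        {"Name": "Tie", "Confidence": 81.2},
    ]
}

def flatten_labels(resp):
    """Render each label as a 'Name Confidence' line, one decimal place."""
    return [f"{lab['Name']} {round(lab['Confidence'], 1)}" for lab in resp["Labels"]]

for line in flatten_labels(response):
    print(line)
```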

Imagga
created on 2022-02-11

crutch 100
staff 81.2
stick 61
nurse 32.8
person 27.3
man 26.9
people 25.6
male 19.1
chair 18
adult 17.5
wheelchair 17.3
planner 16.7
old 14.6
home 14.3
health 13.9
smiling 13.7
men 13.7
family 13.3
care 13.2
happy 13.1
senior 13.1
brass 12.4
medical 11.5
black 11.4
couple 11.3
outdoors 11.2
sitting 11.2
women 11.1
business 10.9
disabled 10.8
trombone 10.7
indoors 10.5
professional 10.2
patient 10.1
businessman 9.7
portrait 9.7
retired 9.7
sick 9.6
elderly 9.6
standing 9.5
hospital 9.5
happiness 9.4
help 9.3
smile 9.3
human 9
together 8.8
seat 8.7
life 8.7
wind instrument 8.7
retirement 8.6
day 8.6
room 8.6
sport 8.4
window 8.2
lifestyle 7.9
illness 7.6
wheel 7.5
future 7.4
cheerful 7.3
alone 7.3
aged 7.2
looking 7.2
love 7.1

Google
created on 2022-02-11

Furniture 94.5
Chair 91.4
Vintage clothing 76.4
Monochrome 74.6
Door 72.5
Classic 72.3
Room 71.9
Monochrome photography 70.1
Sitting 69.6
Event 66.8
History 64.8
Suit 64.1
Picture frame 60.1
Retro style 55.3
Family 52.8

Microsoft
created on 2022-02-11

outdoor 97.5
text 88.8
person 88.7
clothing 76.5
posing 62.7
white 60.3
chair 56.2
old 46.8
seat 45.3

Face analysis

Amazon

Google

AWS Rekognition

Age 54-64
Gender Male, 99.8%
Calm 96.8%
Surprised 2.4%
Confused 0.2%
Sad 0.2%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 99.2%
Happy 60.4%
Calm 22.9%
Disgusted 6.7%
Surprised 3.2%
Sad 3.1%
Angry 1.6%
Confused 1.1%
Fear 1.1%

AWS Rekognition

Age 53-61
Gender Male, 99.9%
Calm 81.8%
Surprised 8.4%
Happy 4.9%
Disgusted 2.5%
Angry 0.9%
Sad 0.7%
Confused 0.7%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 58.4%
Calm 97.5%
Happy 1.8%
Surprised 0.2%
Confused 0.2%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 97%
Calm 56.3%
Surprised 13.1%
Confused 12%
Sad 6.8%
Disgusted 4.9%
Angry 3.8%
Happy 1.8%
Fear 1.3%
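Each Rekognition face block above lists every emotion with a confidence; the leading emotion is simply the highest-scoring entry. A sketch of that reduction, using a hypothetical stub shaped like one `FaceDetail` from `detect_faces` with `Attributes=["ALL"]` (values from the first face above):

```python
# Hypothetical stub shaped like one FaceDetail from Amazon Rekognition's
# detect_faces response; values match the first face block above.
face = {
    "AgeRange": {"Low": 54, "High": 64},
    "Gender": {"Value": "Male", "Confidence": 99.8},
    "Emotions": [
        {"Type": "CALM", "Confidence": 96.8},
        {"Type": "SURPRISED", "Confidence": 2.4},
        {"Type": "CONFUSED", "Confidence": 0.2},
    ],
}

def dominant_emotion(face_detail):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))
```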

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
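Unlike Rekognition's percentages, Google Cloud Vision face detection reports likelihood buckets; the "Very unlikely" / "Unlikely" strings above are readable renderings of those enum values. A sketch of that mapping, with a hypothetical annotation stub whose values match the second Google Vision block above:

```python
# Readable names for the Google Cloud Vision Likelihood enum values
# (UNKNOWN=0 through VERY_LIKELY=5).
LIKELIHOOD_NAMES = {
    0: "Unknown",
    1: "Very unlikely",
    2: "Unlikely",
    3: "Possible",
    4: "Likely",
    5: "Very likely",
}

# Hypothetical stub for one face annotation; values match the second
# Google Vision block above (Joy: Unlikely, everything else: Very unlikely).
annotation = {
    "surprise_likelihood": 1,
    "anger_likelihood": 1,
    "sorrow_likelihood": 1,
    "joy_likelihood": 2,
    "headwear_likelihood": 1,
    "blurred_likelihood": 1,
}

def render(ann):
    """Map each *_likelihood enum value to its readable label."""
    return {k.replace("_likelihood", "").capitalize(): LIKELIHOOD_NAMES[v]
            for k, v in ann.items()}

for field, verdict in render(annotation).items():
    print(field, verdict)
```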

Feature analysis

Amazon

Person 99.5%
Chair 90.3%
Tie 81.2%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 93.1%
a vintage photo of a group of people posing for a picture 93%
a group of people posing for a photo 92.9%
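The three captions above are alternatives ranked by confidence; a display would typically keep only the top one. A sketch of that selection, using a hypothetical stub shaped like a Microsoft Azure Computer Vision "describe image" response (captions and confidences taken from the list above):

```python
# Hypothetical stub shaped like an Azure Computer Vision describe-image
# response; captions and confidences match the list above.
response = {
    "captions": [
        {"text": "a vintage photo of a group of people posing for the camera",
         "confidence": 0.931},
        {"text": "a vintage photo of a group of people posing for a picture",
         "confidence": 0.930},
        {"text": "a group of people posing for a photo",
         "confidence": 0.929},
    ]
}

def best_caption(resp):
    """Return the caption text with the highest confidence."""
    return max(resp["captions"], key=lambda c: c["confidence"])["text"]

print(best_caption(response))
```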

Text analysis

Amazon

5