Human Generated Data

Title

Untitled (young man and kids sitting on ground, woman standing)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16351

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Apparel 99.3
Clothing 99.3
Human 99.1
Person 99.1
Person 98.5
Person 93.7
Outdoors 92.2
Nature 92
Person 89.5
Person 89.5
Person 84.4
Person 74.7
Countryside 72.9
Hat 71.5
People 67.1
Portrait 65.6
Photo 65.6
Photography 65.6
Face 65.6
Helmet 62.3
Overcoat 59.5
Coat 59.5
Rural 59.2
Grass 57.9
Plant 57.9
Suit 57.2
Military Uniform 56.9
Military 56.9
Person 48.7
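
The Amazon list above is a flat series of label/confidence pairs, with repeated labels (here, "Person") corresponding to separate detected instances. A minimal sketch of turning such lines into structured records; the tag values are copied from this record, but the parsing code, function names, and variables are illustrative assumptions, not part of the original data:

```python
# Sketch: parse "Label confidence" lines like the Amazon tag list above.
# The raw_tags sample is an excerpt of the scores in this record; the
# parsing helper itself is an illustrative assumption.
from collections import Counter

raw_tags = """\
Apparel 99.3
Clothing 99.3
Human 99.1
Person 99.1
Person 98.5
Person 93.7
Outdoors 92.2
Person 89.5
Hat 71.5
Person 48.7"""

def parse_tags(text):
    """Split each line into a (label, confidence) pair.

    Labels may contain spaces (e.g. "Military Uniform"), so only the
    final token on each line is treated as the confidence score.
    """
    pairs = []
    for line in text.splitlines():
        label, _, score = line.rpartition(" ")
        pairs.append((label, float(score)))
    return pairs

tags = parse_tags(raw_tags)

# Repeated labels are separate detected instances, each with its own
# confidence, so counting them gives an instance count per label.
person_count = Counter(label for label, _ in tags)["Person"]
```

Note that in this excerpt "Person" appears five times at five different confidences, which is how the tagging service reports multiple people in one image.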

Imagga
created on 2022-02-11

brass 100
wind instrument 85
musical instrument 62.5
landscape 22.3
snow 20.6
man 17.5
outdoors 17.2
outdoor 16.8
shovel 16.3
winter 16.2
old 15.3
male 14.9
sky 14.7
cold 14.6
tree 14.6
beach 14.3
people 13.4
water 13.3
sand 13.1
summer 12.9
field 12.5
vacation 12.3
rural 11.5
outside 11.1
adult 11
holiday 10.7
tool 10.6
forest 10.4
relax 10.1
park 9.9
trees 9.8
lifestyle 9.4
sea 9.4
season 9.3
travel 9.1
scenery 9
black 9
country 8.8
hand tool 8.8
snowy 8.7
scene 8.7
person 8.5
weather 8.3
work 8.2
sunset 8.1
scenic 7.9
portrait 7.8
frost 7.7
building 7.6
sport 7.5
ocean 7.5
vintage 7.4
countryside 7.3
sun 7.2
active 7.2
day 7.1
happiness 7

Microsoft
created on 2022-02-11

outdoor 96.7
text 77.5
clothing 68.4
black and white 65.3
person 57.6

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 72.5%
Happy 62.4%
Surprised 19.5%
Calm 15.4%
Sad 0.7%
Confused 0.7%
Disgusted 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Male, 100%
Happy 52.3%
Calm 18.2%
Surprised 13.6%
Sad 11.7%
Fear 1.7%
Angry 1%
Disgusted 0.9%
Confused 0.7%

AWS Rekognition

Age 30-40
Gender Female, 53.5%
Calm 99.7%
Happy 0.2%
Sad 0.1%
Disgusted 0%
Angry 0%
Surprised 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.1%
Happy 96.4%
Calm 2%
Fear 0.5%
Disgusted 0.3%
Surprised 0.3%
Angry 0.2%
Sad 0.2%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Female, 63.4%
Happy 77.3%
Sad 18.3%
Surprised 1.5%
Calm 1.3%
Fear 0.8%
Angry 0.5%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 29-39
Gender Male, 67.3%
Happy 79.7%
Calm 12.1%
Sad 5.9%
Disgusted 0.7%
Fear 0.5%
Confused 0.4%
Surprised 0.4%
Angry 0.3%
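
The six AWS Rekognition face records above each carry an age range, a gender estimate, and a full emotion distribution. A minimal sketch of summarizing them by dominant emotion; the scores are copied from the records above (top three emotions per face), while the dictionary layout and helper name are illustrative assumptions:

```python
# Sketch: summarise the six AWS Rekognition face records from this page.
# Scores are transcribed from the record; the data structure is assumed.
faces = [
    {"age": (41, 49), "gender": "Female", "emotions": {"Happy": 62.4, "Surprised": 19.5, "Calm": 15.4}},
    {"age": (24, 34), "gender": "Male",   "emotions": {"Happy": 52.3, "Calm": 18.2, "Surprised": 13.6}},
    {"age": (30, 40), "gender": "Female", "emotions": {"Calm": 99.7, "Happy": 0.2, "Sad": 0.1}},
    {"age": (39, 47), "gender": "Male",   "emotions": {"Happy": 96.4, "Calm": 2.0, "Fear": 0.5}},
    {"age": (24, 34), "gender": "Female", "emotions": {"Happy": 77.3, "Sad": 18.3, "Surprised": 1.5}},
    {"age": (29, 39), "gender": "Male",   "emotions": {"Happy": 79.7, "Calm": 12.1, "Sad": 5.9}},
]

def dominant_emotion(face):
    """Return the emotion with the highest confidence for one face."""
    return max(face["emotions"], key=face["emotions"].get)

summary = [dominant_emotion(f) for f in faces]
```

On this record, five of the six faces score highest on Happy and one on Calm, which is consistent with the Microsoft caption of a posed group photo.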

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Helmet 62.3%

Captions

Microsoft

a group of people posing for a photo in front of a window 69.8%
a group of people posing for a photo 67.8%
a group of people posing for the camera 67.7%

Text analysis

Amazon

21.
rap
KODVK-SVEELX

Google

21.
21.