Human Generated Data

Title

Untitled (group of people wearing headdresses in front of tipi)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16902

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Outdoors 99.9
Shelter 99.9
Building 99.9
Rural 99.9
Nature 99.9
Countryside 99.9
Human 98.8
Person 98.8
Person 98
Tree 97.8
Plant 97.8
Vegetation 97.2
Yard 95.4
Person 94.9
Person 94.5
Woodland 92.5
Forest 92.5
Land 92.5
Face 92.3
Chair 91.3
Furniture 91.3
Grove 90.1
Female 89.7
Grass 89.2
Sand 86.6
Dress 84.2
Clothing 84.2
Apparel 84.2
People 78.9
Person 78.7
Person 76
Woman 74.6
Person 74.3
Child 73.5
Kid 73.5
Person 72
Coast 71
Beach 71
Sea 71
Shoreline 71
Water 71
Ocean 71
Portrait 70.1
Photography 70.1
Photo 70.1
Girl 69.4
Field 67.8
Grassland 67.8
Crowd 65.6
Transportation 58.5
Vehicle 58.5
Boat 58.5
Teen 58.2
Vacation 57.7
Leisure Activities 55.9
Lawn 55.7
Park 55.7
Camping 55.6
Play 55.6
Person 54.8
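Each Amazon tag above pairs a label with a confidence score on a 0–100 scale, and some labels (e.g. "Person") repeat with different scores. A minimal, hypothetical sketch of filtering such (label, confidence) pairs by a threshold and keeping only the best score per label — the sample data is copied from the list above:

```python
# Filter machine-generated (label, confidence) pairs by a minimum
# confidence and keep only the highest score seen for each label.
def top_labels(pairs, min_conf=90.0):
    best = {}
    for label, conf in pairs:
        if conf >= min_conf and conf > best.get(label, 0.0):
            best[label] = conf
    # Sort descending by confidence for display.
    return sorted(best.items(), key=lambda kv: -kv[1])

# A few pairs taken from the Amazon tag list above.
tags = [
    ("Outdoors", 99.9), ("Shelter", 99.9), ("Person", 98.8),
    ("Person", 98.0), ("Tree", 97.8), ("Grass", 89.2),
]

print(top_labels(tags))
# → [('Outdoors', 99.9), ('Shelter', 99.9), ('Person', 98.8), ('Tree', 97.8)]
```

Duplicate "Person" entries collapse to the single highest score, and "Grass" drops out below the 90-point threshold.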

Imagga
created on 2022-02-26

canvas tent 100
tent 26.3
travel 21.1
snow 21
winter 20.4
sky 17.9
shelter 17.5
trees 16.9
outdoor 16.8
tree 16.1
mountain tent 15.7
architecture 15.6
landscape 14.9
tourism 14
old 13.9
forest 13.9
vacation 13.9
weather 13.7
summer 13.5
park 13.2
umbrella 13.1
structure 12
parasol 12
history 11.6
building 11.5
water 11.3
cold 11.2
camping 10.8
scenic 10.5
sun 10.5
outdoors 10.4
grass 10.3
season 10.1
holiday 10
rock 9.6
ancient 9.5
church 9.2
city 9.1
religion 9
people 8.9
culture 8.5
adventure 8.5
beach 8.4
leisure 8.3
fun 8.2
sand 8
camp 7.9
person 7.8
stone 7.8
wood 7.5
mountains 7.4
color 7.2
scenery 7.2
sunset 7.2
recreation 7.2
rural 7

Google
created on 2022-02-26

(no tags recorded)

Microsoft
created on 2022-02-26

outdoor 96.8
person 92.4
clothing 88.3
text 64.1
people 60.8
man 54.4

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 67%
Sad 51.1%
Happy 14.8%
Calm 8.5%
Angry 7.2%
Surprised 5.8%
Disgusted 5.2%
Confused 4.5%
Fear 3%

AWS Rekognition

Age 33-41
Gender Male, 99.4%
Sad 48.6%
Confused 43.5%
Calm 2.4%
Disgusted 2.1%
Happy 1.8%
Fear 0.7%
Surprised 0.5%
Angry 0.5%

AWS Rekognition

Age 40-48
Gender Female, 62.7%
Happy 52.7%
Calm 26%
Disgusted 7.1%
Sad 5.5%
Surprised 3.2%
Angry 2.5%
Fear 2%
Confused 0.8%

AWS Rekognition

Age 23-31
Gender Male, 96.5%
Sad 81.3%
Confused 8%
Calm 5.5%
Disgusted 1.6%
Happy 1.1%
Fear 1.1%
Surprised 0.7%
Angry 0.7%

AWS Rekognition

Age 34-42
Gender Male, 99.8%
Calm 48.2%
Sad 37.5%
Happy 10.8%
Confused 1.9%
Disgusted 0.5%
Fear 0.4%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 22-30
Gender Female, 51.4%
Calm 33.3%
Sad 27.2%
Confused 24.7%
Happy 10.8%
Angry 1.5%
Disgusted 1.3%
Fear 0.7%
Surprised 0.6%

AWS Rekognition

Age 29-39
Gender Male, 73.1%
Sad 100%
Confused 0%
Angry 0%
Calm 0%
Disgusted 0%
Happy 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 14-22
Gender Female, 90.3%
Happy 72.8%
Calm 19.7%
Sad 2.3%
Disgusted 1.8%
Confused 1.7%
Angry 0.7%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Female, 61.2%
Happy 95.1%
Calm 3.8%
Disgusted 0.3%
Sad 0.2%
Confused 0.2%
Surprised 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 21-29
Gender Male, 89.2%
Sad 62%
Angry 12.5%
Calm 6.3%
Surprised 6.2%
Disgusted 5.1%
Confused 4.1%
Happy 2.6%
Fear 1.2%

AWS Rekognition

Age 29-39
Gender Male, 86.4%
Happy 58.9%
Sad 17.5%
Calm 15.6%
Confused 3.6%
Angry 1.5%
Disgusted 1.2%
Surprised 0.9%
Fear 0.8%

AWS Rekognition

Age 31-41
Gender Male, 74.1%
Sad 97.7%
Happy 0.7%
Calm 0.7%
Confused 0.4%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 26-36
Gender Female, 98.6%
Fear 47.9%
Calm 16.2%
Sad 11.5%
Happy 10.6%
Surprised 5.3%
Confused 4%
Disgusted 2.5%
Angry 2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
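Each AWS Rekognition face record above reports an estimated age range, a gender guess with its confidence, and per-emotion scores that sum to roughly 100%. A small hypothetical sketch of reading the dominant emotion out of one such record (the field names here are illustrative, not the raw API response schema; the values are copied from the first record above):

```python
# One face record in the shape of the AWS Rekognition results listed above.
face = {
    "age_range": (35, 43),
    "gender": ("Male", 67.0),
    "emotions": {
        "Sad": 51.1, "Happy": 14.8, "Calm": 8.5, "Angry": 7.2,
        "Surprised": 5.8, "Disgusted": 5.2, "Confused": 4.5, "Fear": 3.0,
    },
}

def dominant_emotion(record):
    """Return the (emotion, score) pair with the highest score."""
    return max(record["emotions"].items(), key=lambda kv: kv[1])

print(dominant_emotion(face))
# → ('Sad', 51.1)
```

For the record above this picks "Sad" at 51.1%, which matches the ordering shown in the listing, where emotions appear sorted by score.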

Feature analysis

Amazon

Person 98.8%
Boat 58.5%

Captions

Microsoft

a group of people posing for a photo 76.6%
a group of people posing for a picture 76.5%
a group of people posing for the camera 76.4%

Text analysis

Amazon

3
MÍI7--YT37- -

Google

MJ17--YT33A°2-
MJ17--YT33A°2- -AGON
-AGON