Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.78

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.6
Person 98.9
Person 98.7
Porch 96.7
Person 93.7
Patio 92.7
Shelter 87.5
Outdoors 87.5
Nature 87.5
Building 87.5
Countryside 87.5
Rural 87.5
Pergola 84.3
Clothing 83.8
Apparel 83.8
Person 82.9
Shorts 80.3
Shoe 76.7
Footwear 76.7
Shoe 67.1
People 64.4
Person 59.7
Person 57.8
Crowd 56.1
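
The Amazon figures above pair each detected label with a Rekognition confidence score on a 0-100 scale. A minimal sketch of reproducing such a listing with the boto3 Rekognition client; the file name and confidence cutoff are hypothetical stand-ins:

    import boto3

    client = boto3.client("rekognition")

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    # DetectLabels returns one entry per label, each scored 0-100.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumption: cutoff near the lowest score shown above
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')
        # Repeated rows such as the many "Person" lines above come from
        # per-instance detections nested under each label:
        for instance in label.get("Instances", []):
            print(f'{label["Name"]} {instance["Confidence"]:.1f}')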

Clarifai
created on 2023-10-25

people 100
group 99.9
group together 99.8
many 99.3
child 98.9
adult 98.8
several 98.4
man 97.8
woman 97.4
wear 95.4
furniture 94.9
recreation 94.6
five 94.3
four 94.2
home 93.6
boy 92.8
vehicle 89.9
administration 87.5
leader 86.8
education 84.6
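
The Clarifai concepts are scored the same way, shown here on a 0-100 scale although the API itself returns 0-1 values. A sketch using the clarifai-grpc client against Clarifai's public general recognition model; the access token and file name are placeholders, and field details can vary between client versions:

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder access token

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        request = service_pb2.PostModelOutputsRequest(
            user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
            model_id="general-image-recognition",
            inputs=[resources_pb2.Input(
                data=resources_pb2.Data(image=resources_pb2.Image(base64=f.read()))
            )],
        )

    response = stub.PostModelOutputs(request, metadata=metadata)
    for concept in response.outputs[0].data.concepts:
        # concept.value is 0-1; scale to match the 0-100 figures above.
        print(concept.name, round(concept.value * 100, 1))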

Imagga
created on 2022-01-09

percussion instrument 30.1
musical instrument 27
drum 19
house 18.4
man 17.5
home 16.7
building 16.7
people 16.2
adult 15.5
marimba 14.4
village 14.3
male 13.6
hut 13.3
old 13.2
outdoors 12.7
architecture 12.5
day 11.8
structure 11.7
person 11.1
child 10.6
roof 10.6
holiday 10
city 10
farm 9.8
standing 9.5
sitting 9.4
waiter 9.4
smiling 9.4
work 9.1
industry 8.5
historic 8.2
happy 8.1
school 8.1
wooden 7.9
seller 7.8
men 7.7
tree 7.7
religious 7.5
traditional 7.5
tradition 7.4
vacation 7.4
window 7.3
new 7.3
worker 7.1
portrait 7.1
summer 7.1
travel 7
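
Imagga exposes its tagger as a plain REST endpoint, so its tag/confidence pairs can be fetched with a single authenticated GET request. A sketch with the requests library; the key, secret, and image URL are placeholders:

    import requests

    # Placeholder credentials and image location.
    auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")
    image_url = "https://example.org/photo_album_page.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=auth,
    )

    # Each entry carries a confidence on a 0-100 scale, as listed above.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))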

Google
created on 2022-01-09

Building 85.4
Table 84.4
House 81.2
Window 78
Tints and shades 77.2
Chair 77.1
Vintage clothing 75.3
Recreation 71.6
Room 69.1
Event 68.7
Monochrome 68.4
History 68.1
Tree 61.3
Team 60.4
Classic 52.1
Hat 50.9
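
The Google labels correspond to Cloud Vision label detection, which scores each label 0-1; the listing above shows those scores as percentages. A sketch with the google-cloud-vision client (file name again hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # label.score is 0-1; the listing above shows it as a percentage.
        print(label.description, round(label.score * 100, 1))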

Microsoft
created on 2022-01-09

outdoor 97.8
clothing 97.3
person 92.8
man 89.6
old 88.8
people 75.7
footwear 72.1
group 67.4
posing 38.6
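
The Microsoft tags plausibly come from the Azure Computer Vision tagging operation, which likewise scores tags 0-1. A sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and file name are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))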

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.5%
Sad 85.6%
Calm 9.7%
Confused 1.4%
Happy 1.3%
Disgusted 0.7%
Angry 0.5%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 28-38
Gender Male, 73.2%
Sad 58.6%
Confused 13.5%
Calm 9.4%
Angry 6.7%
Disgusted 6.6%
Surprised 2.4%
Fear 1.6%
Happy 1.1%

AWS Rekognition

Age 26-36
Gender Male, 87.8%
Calm 45.3%
Sad 40.1%
Fear 3.7%
Angry 2.8%
Confused 2.7%
Disgusted 2.5%
Surprised 2.2%
Happy 0.6%

AWS Rekognition

Age 21-29
Gender Female, 97.4%
Calm 58.4%
Angry 17.6%
Disgusted 7.7%
Confused 4.7%
Happy 4.7%
Sad 3%
Surprised 2.1%
Fear 1.9%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Angry 42.6%
Surprised 19.5%
Fear 14.3%
Calm 12.6%
Disgusted 3.3%
Sad 3.3%
Happy 2.9%
Confused 1.7%

AWS Rekognition

Age 39-47
Gender Male, 98.8%
Sad 94%
Calm 4.8%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 31-41
Gender Female, 95.8%
Angry 99.3%
Sad 0.5%
Calm 0.1%
Disgusted 0%
Fear 0%
Surprised 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 24-34
Gender Female, 98.5%
Happy 97.9%
Fear 0.8%
Surprised 0.4%
Calm 0.3%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 26-36
Gender Male, 96.8%
Calm 61.4%
Sad 34.7%
Confused 1.2%
Fear 1%
Disgusted 0.6%
Angry 0.5%
Surprised 0.4%
Happy 0.4%

AWS Rekognition

Age 23-33
Gender Male, 98.3%
Sad 64.7%
Angry 18.9%
Fear 12.3%
Calm 1.7%
Disgusted 1.3%
Confused 0.5%
Happy 0.3%
Surprised 0.3%
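
Each AWS Rekognition block above describes one face returned by DetectFaces with full attributes: an estimated age range, a gender call with its confidence, and an emotion distribution summing to roughly 100%. A sketch that requests those attributes and prints them in the same shape (file name hypothetical):

    import boto3

    client = boto3.client("rekognition")

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; sort descending to match the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')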

Microsoft Cognitive Services

Age 28
Gender Male
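
The single Microsoft Cognitive Services entry (a point age estimate plus a gender label) matches what the Azure Face API's age and gender attributes used to return; Microsoft has since retired those attributes, so this sketch of the older azure-cognitiveservices-vision-face SDK is an assumption about how the entry was produced, with placeholder endpoint, key, and URL:

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    face_client = FaceClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    # Hypothetical public URL for the scan.
    faces = face_client.face.detect_with_url(
        url="https://example.org/photo_album_page.jpg",
        return_face_attributes=["age", "gender"],
    )

    for face in faces:
        print("Age", face.face_attributes.age)
        print("Gender", face.face_attributes.gender)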

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Likely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
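
Unlike the other services, Google Vision reports face attributes as likelihood buckets rather than percentages: enum values running from VERY_UNLIKELY to VERY_LIKELY, rendered above as "Very unlikely" through "Likely". A sketch that prints one such block per detected face (file name hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo_album_page.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)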

Feature analysis

Amazon

Person 99.7%
Shoe 76.7%