Human Generated Data

Title

Simla, Hill Coolies with Khilhat and Dandy

Date

1860s

People

Artist: Samuel Bourne and Charles Shepherd, British, in partnership 1863-1870

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. R. Minturn Sedgwick in memory of R. Minturn Sedgwick, Class of 1921, P1981.28

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.8
Person 98.2
Person 98.1
Painting 94.2
Art 94.2
Person 93.6
People 76.1
Tribe 61.1

Clarifai
created on 2023-10-26

people 100
group 99.3
wear 98.8
adult 98.5
drum 98
group together 97.9
outfit 97.4
man 97.3
percussion instrument 96.2
child 95.3
music 94.4
three 92.8
four 92.6
veil 92.4
many 91.2
several 90
two 89.3
musician 88.6
leader 88.5
uniform 88.3

Imagga
created on 2022-01-23

person 23.1
man 22.8
male 19.3
sport 18.3
outdoor 15.3
people 15.1
protection 13.6
danger 13.6
statue 13.5
uniform 12.8
military 12.6
player 12.5
athlete 12.3
sky 12.1
attendant 12
clothing 12
old 11.8
adult 11.8
toxic 11.7
mask 11.7
weapon 11.1
soldier 10.7
protective 10.7
nuclear 10.7
travel 10.6
summer 10.3
competition 10.1
industrial 10
dirty 9.9
ballplayer 9.9
radioactive 9.8
radiation 9.8
war 9.7
chemical 9.6
gas 9.6
sculpture 9.6
beach 9.3
black 9
sunset 9
outdoors 9
brass 8.9
helmet 8.9
destruction 8.8
accident 8.8
boy 8.7
dangerous 8.6
horse 8.5
trombone 8.3
world 8.2
wind instrument 8.1
military uniform 7.9
stalker 7.9
grass 7.9
art 7.9
planner 7.9
child 7.8
ancient 7.8
death 7.7
industry 7.7
dark 7.5
silhouette 7.4
tradition 7.4
musical instrument 7.3
religion 7.2
history 7.2
game 7.1
portrait 7.1
contestant 7.1
day 7.1
animal 7.1
gun 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.1
outdoor 94.8
clothing 93.6
person 88.4
old 85.9
player 64.2
posing 55.8
photograph 55.4
man 53.3

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Calm 94.6%
Surprised 3.4%
Happy 0.6%
Disgusted 0.5%
Angry 0.5%
Confused 0.2%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 35-43
Gender Male, 100%
Confused 40.9%
Calm 33.8%
Sad 11.3%
Angry 5.5%
Fear 3.6%
Surprised 2.2%
Happy 1.5%
Disgusted 1.4%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 98.6%
Sad 0.5%
Confused 0.5%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 97.2%
Sad 1.2%
Confused 0.6%
Angry 0.3%
Disgusted 0.3%
Happy 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 100%
Calm 43.6%
Sad 19.1%
Fear 14%
Angry 5.6%
Happy 5.4%
Disgusted 4.4%
Surprised 4.1%
Confused 3.8%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 61
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 94.2%
