Human Generated Data

Title

"Les Pegas" (Seventh of nine): A toreador uses his cape to attract the attention of the mad bull.

Date

April-May 1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.75

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Person 99.8
Human 99.8
Person 99.8
Animal 99.7
Mammal 99.7
Bull 99.7
Person 99
Person 98.7
Person 98.2
Bullfighting 95.4
Bullfighter 95.4
Person 95.1
Person 90.1
Horse 82.9
Person 77.8
Building 71.2
Person 62.6

Clarifai
created on 2019-03-22

people 100
group 99.5
group together 99.4
man 98.7
many 98.5
adult 97.7
mammal 91.8
four 91.6
military 91.4
several 90.5
five 89.9
recreation 89.2
wear 88.9
cattle 87.3
child 86.7
woman 85.2
dancing 81.6
war 74.7
administration 74.3
soldier 74

Imagga
created on 2019-03-22

beach 36
ocean 30.3
sand 29
sea 25.8
people 25.7
man 23.5
water 20.7
vacation 19.6
person 19.4
sport 19.2
travel 18.3
silhouette 18.2
summer 18
coast 18
outdoors 16.5
active 16.4
male 16.4
shore 15.8
child 14.9
fun 14.2
walking 14.2
men 13.7
group 13.7
animal 13.6
waves 13
sky 12.8
lifestyle 12.3
adult 11.7
attendant 11.5
performer 11.4
relax 10.9
sunset 10.8
family 10.7
horse 10.6
walk 10.5
couple 10.4
kids 10.4
coastline 10.3
outdoor 9.9
dancer 9.9
run 9.6
wave 9.5
life 9.5
happy 9.4
holiday 9.3
leisure 9.1
tourism 9.1
surf 8.7
tropical 8.5
winter 8.5
kin 8.3
recreation 8.1
bullring 7.9
together 7.9
happiness 7.8
black 7.8
horses 7.8
play 7.8
bull 7.6
tourist 7.6
evening 7.5
competition 7.3
danger 7.3
sun 7.2
women 7.1
love 7.1
day 7.1

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

outdoor 98.2
person 96.8
people 59.8
beach 53.4
child 32.5
black and white 27.4
boy 21.3
group 19.2

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 54%
Confused 45.3%
Calm 48.8%
Disgusted 45.6%
Surprised 45.7%
Sad 48.2%
Angry 46.1%
Happy 45.5%

AWS Rekognition

Age 20-38
Gender Female, 54.4%
Confused 45.3%
Angry 45.9%
Disgusted 45.3%
Sad 50.5%
Calm 46.8%
Happy 45.5%
Surprised 45.7%

AWS Rekognition

Age 26-43
Gender Female, 53%
Disgusted 45.8%
Confused 45.2%
Happy 45.8%
Calm 46.4%
Angry 49.6%
Surprised 45.7%
Sad 46.5%

AWS Rekognition

Age 23-38
Gender Male, 53.3%
Happy 45%
Angry 45.3%
Sad 49.4%
Calm 49.9%
Surprised 45.2%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Angry 45.4%
Surprised 45.1%
Disgusted 45.5%
Sad 45.1%
Calm 53.4%
Happy 45.3%
Confused 45.2%

AWS Rekognition

Age 10-15
Gender Male, 54.1%
Sad 45.1%
Disgusted 45%
Surprised 45.1%
Angry 45.1%
Calm 54.5%
Happy 45%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 52.3%
Angry 46.3%
Calm 45.4%
Confused 45.1%
Disgusted 45.1%
Happy 45.1%
Sad 52.9%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.7%
Calm 47.1%
Sad 49.6%
Disgusted 45.1%
Surprised 45.4%
Confused 45.5%
Happy 46.5%
Angry 45.9%

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Horse 82.9%
