Human Generated Data

Title

1 de Enero de 1959 (La Havana)

Date

1959, printed 2001

People

Artist: Ernesto Fernández, Cuban b. 1939?

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the David Rockefeller Center for Latin American Studies, Harvard University, Gift of the artist, 2012.158

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-07-07

War 99.9
People 99.7
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.8
Adult 97.8
Male 97.8
Man 97.8
Person 97.6
Person 96.6
Adult 96.6
Male 96.6
Man 96.6
Person 95.8
Adult 95.8
Male 95.8
Man 95.8
Person 95.3
Person 94.9
Adult 94.9
Male 94.9
Man 94.9
Person 94.4
Adult 94.4
Male 94.4
Man 94.4
Clothing 92.6
Footwear 92.6
Shoe 92.6
Person 89.7
Adult 89.7
Female 89.7
Woman 89.7
Person 88.1
Shoe 86.6
Military 85.9
Person 84.8
Shoe 84.6
Face 82.8
Head 82.8
Person 80.3
Person 80.3
Car 80.2
Transportation 80.2
Vehicle 80.2
Accessories 79.3
Bag 79.3
Handbag 79.3
Armored 78.3
Tank 78.3
Weapon 78.3
Person 77.3
Person 72.3
Person 70.3
Person 69.5
Flag 67.1
Shoe 63.3
Person 63.1
Shoe 59
Military Uniform 57.6
Shoe 57
Soldier 57
Army 55.8
License Plate 55.8
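
The label/score pairs above have the shape of output from AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be produced with boto3; the image file name, region, and MinConfidence threshold are placeholder assumptions, not values recorded here.

```python
# Hypothetical sketch: label/confidence pairs shaped like the Amazon
# tags above, via AWS Rekognition DetectLabels. The file name, region,
# and threshold are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2012.158.jpg", "rb") as f:  # hypothetical local copy of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score shown above is about 55.8
)

# DetectLabels returns each label once, with a 0-100 Confidence and an
# Instances list of bounding boxes; the repeated Person/Adult/Male/Man
# rows above look like one row per detected instance.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    for instance in label.get("Instances", []):
        print(f"  instance confidence {instance['Confidence']:.1f}")
```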

Clarifai
created on 2023-10-13

people 100
group together 99.8
many 99.7
group 99.7
military 98.9
adult 98.8
vehicle 97.9
man 97.1
war 96.7
soldier 95.8
several 95.6
administration 93.6
outfit 93.4
woman 92
wear 91.8
skirmish 90.8
crowd 89.9
leader 89.9
military uniform 87.9
uniform 86.4
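
The Clarifai concepts are consistent with Clarifai's v2 predict API, which scores concepts from 0 to 1 (shown above multiplied by 100). A hedged sketch against the public REST endpoint follows; the API key, image URL, and choice of the general recognition model are all assumptions.

```python
# Hypothetical sketch: concept/score pairs like the Clarifai tags above,
# via Clarifai's v2 predict REST endpoint. The key, image URL, and model
# id are placeholder assumptions.
import requests

MODEL_ID = "general-image-recognition"  # assumed: Clarifai's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key <api_key>"},  # placeholder credential
    json={"inputs": [{"data": {"image": {"url": "https://example.org/2012.158.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry a value in [0, 1]; scaling by 100 gives scores like
# "people 100" and "group together 99.8" above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```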

Imagga
created on 2023-07-07

military uniform 45.3
uniform 37.6
clothing 29.3
vehicle 21.4
covering 19.1
consumer goods 18.3
tank 16.9
military vehicle 16.6
city 16.6
private 16.6
tracked vehicle 16.5
people 15.6
architecture 14.8
old 14.6
musical instrument 14.2
seller 13.6
military 12.5
history 12.5
person 12.5
ancient 12.1
tourism 11.5
war 11.5
tree 11.5
man 11.4
outdoors 11.2
vintage 10.7
male 10.7
armored vehicle 10.6
travel 10.6
statue 10.5
portrait 10.3
world 10.3
historic 10.1
sky 9.6
building 9.5
percussion instrument 9.5
happy 9.4
culture 9.4
commodity 9.1
park 9.1
aged 9
religion 9
weapon 8.8
boy 8.7
sculpture 8.7
water 8.7
mother 8.6
conveyance 8.4
wheeled vehicle 8.3
transportation 8.1
lifestyle 7.9
warfare 7.9
holiday 7.9
army 7.8
machine 7.8
sitting 7.7
men 7.7
winter 7.7
outdoor 7.6
half track 7.4
teen 7.3
cheerful 7.3
child 7.3
new 7.3
business 7.3
smiling 7.2
landmark 7.2
drum 7.2
home 7.2
family 7.1
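
Imagga's tagging endpoint returns confidences already on a 0-100 scale, matching the scores above. A sketch against its v2 REST API; the credentials and image URL are placeholders.

```python
# Hypothetical sketch: tag/confidence pairs like the Imagga list above,
# via Imagga's v2 tagging endpoint. Credentials and URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    auth=("<api_key>", "<api_secret>"),  # placeholder Imagga credentials
    params={"image_url": "https://example.org/2012.158.jpg"},
)
response.raise_for_status()

# Confidence is already a percentage, e.g. "military uniform 45.3".
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```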

Google
created on 2023-07-07

Microsoft
created on 2023-07-07

person 99.6
text 92.4
group 88.2
people 83.2
old 80.6
vehicle 80.1
clothing 72.4
man 72.2
posing 54.9
vintage 25.2
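
The Microsoft tags match the shape of the Azure Computer Vision image-analysis API, which scores tags from 0 to 1. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholder assumptions.

```python
# Hypothetical sketch: tag/confidence pairs like the Microsoft list
# above, via the Azure Computer Vision SDK. Endpoint, key, and file
# name are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<key>"),              # placeholder key
)

with open("2012.158.jpg", "rb") as f:  # hypothetical local copy
    analysis = client.analyze_image_in_stream(
        f, visual_features=[VisualFeatureTypes.tags]
    )

# Tag confidence is in [0, 1]; scaling by 100 matches "person 99.6".
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```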

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 6-16
Gender Male, 96.1%
Happy 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.1%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 12-20
Gender Female, 99.9%
Calm 95.9%
Surprised 6.8%
Fear 6%
Sad 2.4%
Happy 0.7%
Angry 0.6%
Confused 0.6%
Disgusted 0.3%

AWS Rekognition

Age 21-29
Gender Female, 94%
Sad 99.3%
Happy 21.1%
Fear 8.5%
Calm 7.5%
Surprised 7%
Angry 1.1%
Disgusted 1%
Confused 0.5%

AWS Rekognition

Age 36-44
Gender Male, 99%
Calm 89.7%
Sad 7.5%
Surprised 6.3%
Fear 6.1%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Sad 100%
Calm 7.1%
Surprised 6.3%
Fear 6.1%
Confused 3.6%
Angry 0.9%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Calm 98.4%
Surprised 6.7%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 25-35
Gender Female, 99.7%
Happy 70%
Calm 12%
Surprised 11.5%
Fear 6.8%
Sad 2.9%
Angry 2.4%
Disgusted 1.9%
Confused 1.5%

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Calm 43.7%
Fear 22.2%
Angry 10.9%
Surprised 9.2%
Sad 5%
Happy 4.5%
Confused 4.4%
Disgusted 2.9%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Happy 96.9%
Surprised 6.5%
Fear 6%
Sad 2.2%
Calm 1.7%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 35-43
Gender Female, 72%
Calm 90.9%
Surprised 7.9%
Fear 6.2%
Sad 3.1%
Disgusted 1%
Happy 0.9%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 23-33
Gender Male, 99.2%
Calm 33.5%
Sad 30.7%
Angry 21.1%
Surprised 10.6%
Fear 9.2%
Confused 5.3%
Happy 2.7%
Disgusted 2.4%

AWS Rekognition

Age 23-31
Gender Male, 83.1%
Calm 96.7%
Surprised 6.5%
Fear 6%
Sad 2.5%
Happy 0.9%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 6-16
Gender Male, 99.4%
Calm 91.3%
Surprised 6.4%
Fear 5.9%
Sad 3.8%
Angry 2.2%
Confused 1.1%
Happy 0.6%
Disgusted 0.2%

AWS Rekognition

Age 24-34
Gender Female, 69%
Calm 39%
Sad 22.5%
Surprised 18.4%
Happy 12.1%
Fear 8%
Angry 6.4%
Confused 2.9%
Disgusted 2.4%

AWS Rekognition

Age 16-24
Gender Female, 59.1%
Sad 100%
Calm 14.3%
Surprised 6.6%
Fear 6.3%
Happy 1.3%
Disgusted 1.2%
Confused 0.6%
Angry 0.5%

AWS Rekognition

Age 18-26
Gender Male, 72.8%
Calm 79.5%
Fear 7%
Surprised 6.9%
Disgusted 6.1%
Sad 4.1%
Confused 2.5%
Happy 2.4%
Angry 0.7%
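
Each AWS Rekognition block above (an age range, a gender with confidence, and eight independently scored emotions) matches the DetectFaces operation with all facial attributes requested. A sketch with boto3; the file name and region are placeholders.

```python
# Hypothetical sketch: per-face age range, gender, and emotion scores
# like the AWS Rekognition blocks above, via DetectFaces. File name and
# region are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2012.158.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Each emotion is scored independently, which is why one face above
    # can show both "Sad 100%" and "Calm 14.3%"; scores need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```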

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
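
The Google Vision blocks report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores. A sketch with the google-cloud-vision client; the file name is a placeholder assumption.

```python
# Hypothetical sketch: bucketed per-face attributes like the Google
# Vision blocks above, via the Cloud Vision face-detection API. The
# file name is a placeholder assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2012.158.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood is a five-step enum (VERY_UNLIKELY ... VERY_LIKELY),
# rendered above as "Very unlikely", "Possible", "Very likely", etc.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```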

Feature analysis

Amazon

Person 98.9%
Adult 98.9%
Male 98.9%
Man 98.9%
Shoe 92.6%
Female 89.7%
Woman 89.7%
Car 80.2%
Handbag 79.3%
Flag 67.1%

Text analysis

Amazon

CLET
Polar

Google

I TELL Polar CHESTOLET
I
TELL
Polar
CHESTOLET
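
The Google text result lists the full detected string first and then each token on its own line; that matches Cloud Vision text detection, where the first annotation holds the complete text and subsequent annotations are individual words. A sketch follows (file name assumed); the Amazon lines above would come from Rekognition's DetectText operation instead.

```python
# Hypothetical sketch: OCR output shaped like the Google text analysis
# above, via Cloud Vision text detection. The first annotation holds
# the whole string ("I TELL Polar CHESTOLET"); the rest are words.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2012.158.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```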