Human Generated Data

Title

Untitled (group of men and women standing in front of awning at Uhland Community Fair)

Date

1947

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11251

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.6
Person 99
Person 98.9
Person 98.8
Person 96.6
Military 94.5
People 93.7
Military Uniform 93.5
Officer 88
Person 75
Person 72.5
Army 69.9
Armored 69.9
Person 69.2
Troop 69
Soldier 62.8
Sailor Suit 58.2
Crowd 55.3
Person 43.1

Imagga
created on 2022-01-23

military uniform 59.6
uniform 57.2
clothing 44.1
covering 28.2
man 28.2
consumer goods 25.9
people 22.9
person 22.4
male 19.1
nurse 16.1
military 15.4
athlete 15
adult 15
helmet 14.4
danger 13.6
sport 13.3
walking 13.3
ballplayer 13.2
player 12.9
war 12.5
street 12
commodity 11.9
pedestrian 11.8
city 11.6
protection 10.9
boy 10.4
portrait 10.3
men 10.3
outdoor 9.9
outdoors 9.7
weapon 9.7
nuclear 9.7
walk 9.5
football helmet 9.3
travel 9.2
industrial 9.1
old 9.1
contestant 9
mask 8.8
toxic 8.8
army 8.8
together 8.8
women 8.7
armor 8.6
two 8.5
dark 8.3
holding 8.3
gun 8.1
dirty 8.1
transportation 8.1
group 8.1
to 8
urban 7.9
radioactive 7.8
radiation 7.8
soldier 7.8
destruction 7.8
black 7.8
protective 7.8
chemical 7.7
gas 7.7
winter 7.7
dangerous 7.6
human 7.5
world 7.4
tradition 7.4
shield 7.3
brass 7.2
family 7.1
day 7.1

Google
created on 2022-01-23

Window 91.4
Standing 86.4
Headgear 81.9
Motor vehicle 78.6
Crew 74.9
Vintage clothing 72.3
Team 69.9
Uniform 67.9
Event 66.7
History 65.1
Monochrome 64.6
Classic 62.2
Hat 55.6

Microsoft
created on 2022-01-23

person 99.9
outdoor 99
posing 97.5
grass 97
clothing 95.1
standing 93.9
man 91.5
black 88
text 87.5
group 87.2
old 86.4
white 77
team 74.4

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 51.3%
Calm 99.1%
Sad 0.2%
Angry 0.2%
Confused 0.2%
Happy 0.1%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Calm 87.5%
Happy 8.6%
Surprised 1.6%
Angry 0.7%
Disgusted 0.5%
Confused 0.4%
Sad 0.4%
Fear 0.3%

AWS Rekognition

Age 53-61
Gender Female, 86.8%
Sad 77.3%
Calm 18.2%
Fear 2.5%
Confused 0.8%
Disgusted 0.6%
Happy 0.3%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 36-44
Gender Female, 100%
Happy 99.3%
Surprised 0.2%
Confused 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Calm 0.1%
Sad 0.1%

AWS Rekognition

Age 58-66
Gender Female, 99.4%
Disgusted 36.3%
Angry 18%
Happy 14.2%
Calm 12.2%
Surprised 7.3%
Fear 4.7%
Sad 4.2%
Confused 3%

AWS Rekognition

Age 35-43
Gender Female, 100%
Sad 50%
Disgusted 43.1%
Confused 3.9%
Angry 1.7%
Fear 0.5%
Calm 0.4%
Surprised 0.2%
Happy 0.2%

AWS Rekognition

Age 49-57
Gender Female, 100%
Happy 95.3%
Calm 3.7%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 48-54
Gender Male, 94.5%
Calm 84.2%
Angry 14.4%
Sad 1%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Calm 70.8%
Sad 13.4%
Surprised 6.5%
Happy 3.5%
Angry 3.1%
Disgusted 1.1%
Fear 0.9%
Confused 0.7%

AWS Rekognition

Age 47-53
Gender Female, 100%
Calm 57.9%
Sad 17.3%
Happy 11.7%
Confused 4.6%
Fear 3.2%
Disgusted 2.5%
Angry 1.5%
Surprised 1.3%

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people posing for a photo 98.4%
an old photo of a group of people posing for the camera 97.8%
an old photo of a group of people posing for a picture 97.7%