Human Generated Data

Title

Untitled (Uhland Fair, group of men and women standing outside tent)

Date

1947

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2810

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.9
Human 99.9
Person 99.7
Person 99.7
Person 99.7
Person 99.4
Person 99
Person 98.9
Person 98.8
Military 97.1
Person 97
Military Uniform 96.6
Officer 91.6
Clothing 91
Apparel 91
Person 86.9
Sailor Suit 79.9
Soldier 79.4
People 78.2
Person 70.1
Army 67.8
Armored 67.8
Shorts 62
Person 62
Window 56
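
The Amazon scores above are Rekognition label confidences, reported in percent. Below is a minimal sketch of how such tags can be produced with the DetectLabels API via boto3; the region, bucket, and object names are placeholders, not the museum's actual storage.

```python
import boto3

# Placeholder region, bucket, and key: substitute your own image location.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-2810.jpg"}},
    MaxLabels=30,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Each label pairs a name with a percent confidence, matching the
    # "Person 99.9" style entries above. Labels such as Person also carry
    # per-instance bounding boxes in label["Instances"].
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```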

Clarifai
created on 2023-10-26

people 99.8
group together 97.4
uniform 95.4
group 94.7
adult 92.6
wear 92.6
man 92.3
many 91.8
military 89.5
administration 87.8
outfit 87.5
child 85.9
war 85.3
soldier 80
veil 77
woman 76.3
recreation 74.8
leader 73.9
boy 73.2
family 73
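
Clarifai's concept scores can be obtained from its predict endpoint. The sketch below is an assumption-heavy illustration: the model ID (general-image-recognition), endpoint shape, and key header reflect Clarifai's public REST API but may differ by account and API version.

```python
import requests

API_KEY = "YOUR_CLARIFAI_KEY"  # placeholder credential
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/annas-2810.jpg"}}}]}
resp = requests.post(URL, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in the 0-1 range; scaling by 100 gives the
    # percent-style scores above (e.g. "people 99.8").
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```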

Imagga
created on 2022-01-16

wind instrument 48.5
musical instrument 47.4
brass 46.5
trombone 34.1
man 25.5
people 24
silhouette 20.7
accordion 19.5
male 19.1
adult 17.6
men 17.2
group 16.9
person 16.8
keyboard instrument 15.6
uniform 15.4
clothing 14.9
outdoors 11.9
military 11.6
military uniform 11.4
walking 11.4
women 11.1
black 11
business 10.9
protection 10.9
outfit 10.9
holding 10.7
crowd 10.6
girls 10
outdoor 9.9
businessman 9.7
together 9.6
sport 9.3
active 9.2
travel 9.1
dirty 9
sunset 9
sky 8.9
soldier 8.8
couple 8.7
day 8.6
dark 8.3
danger 8.2
industrial 8.2
mountain 8
stalker 7.9
grass 7.9
radioactive 7.8
radiation 7.8
boy 7.8
standing 7.8
destruction 7.8
accident 7.8
toxic 7.8
protective 7.8
mask 7.8
nuclear 7.8
chemical 7.7
gas 7.7
war 7.7
walk 7.6
fun 7.5
city 7.5
photographer 7.5
freedom 7.3
tourist 7.2
suit 7.2
team 7.2
family 7.1
happiness 7
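
Imagga exposes a comparable tagging endpoint. Here is a hedged sketch assuming Imagga's documented /v2/tags REST call with HTTP basic auth; the credentials and image URL are placeholders.

```python
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder credentials
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas-2810.jpg"},
    auth=auth,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Confidence is already a 0-100 score, matching "wind instrument 48.5".
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```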

Microsoft
created on 2022-01-16

outdoor 92.8
person 90.5
text 89.4
group 89.3
clothing 85.6
standing 76.9
white 67.8
posing 63.3
black and white 53
team 47
old 46.9
clothes 28.6
line 28.4
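
Microsoft's scores come from its image-analysis service. The sketch below assumes the Azure Computer Vision v3.2 analyze endpoint with the Tags feature; the resource endpoint and key are placeholders, and the exact API version used for this record is not documented here.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder credential

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/annas-2810.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is 0-1; multiplying by 100 gives the percent-style scores above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```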

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 74.3%
Calm 85.1%
Sad 6.9%
Angry 4.8%
Confused 1.7%
Disgusted 0.6%
Surprised 0.3%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Female, 77.2%
Calm 99.7%
Happy 0.2%
Confused 0%
Sad 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 93.2%
Fear 3.1%
Sad 1%
Surprised 0.7%
Disgusted 0.6%
Confused 0.5%
Angry 0.5%
Happy 0.4%

AWS Rekognition

Age 48-54
Gender Female, 90.4%
Calm 100%
Happy 0%
Sad 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Female, 53.6%
Calm 99.6%
Happy 0.2%
Sad 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 71%
Calm 99.6%
Happy 0.3%
Confused 0.1%
Disgusted 0%
Surprised 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 99.6%
Happy 69.8%
Calm 7.8%
Sad 7.5%
Angry 4.9%
Fear 3.4%
Confused 2.6%
Disgusted 2.3%
Surprised 1.6%

AWS Rekognition

Age 23-31
Gender Female, 98.7%
Angry 41.1%
Happy 19.4%
Fear 9.6%
Calm 9.4%
Confused 8.6%
Sad 6.3%
Disgusted 3.9%
Surprised 1.7%
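
Each block above corresponds to one detected face. Below is a minimal sketch of the Amazon Rekognition DetectFaces call that yields these age ranges, gender estimates, and emotion scores; bucket and key names are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-2810.jpg"}},
    Attributes=["ALL"],  # "ALL" adds age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive as type/confidence pairs; sorting by confidence
    # reproduces the "Calm 85.1%" ordering above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```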

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
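
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores, one block per detected face. A sketch using the google-cloud-vision client follows; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/annas-2810.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```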

Feature analysis

Amazon

Person 99.9%

Text analysis

Amazon

J33

Google

YT3RA2- A
YT3RA2-
A
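
Below is a sketch of the Amazon Rekognition DetectText call behind the "J33" result; bucket and key names are placeholders. Google Vision's text_detection behaves similarly, returning the combined string first and then individual tokens, which is why its list shows both "YT3RA2- A" and the separate "A".

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas-2810.jpg"}}
)

for detection in response["TextDetections"]:
    # Rekognition returns LINE entries (grouped words) and WORD entries
    # (individual tokens), each with a percent confidence.
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```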