Human Generated Data

Title

Square dance, Skyline Farms, Alabama

Date

1937

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3113

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.9
Human 99.9
Person 99.6
Person 99.5
Person 98.9
Person 97.6
Person 94.6
Person 91.2
Person 91.2
Shoe 91
Footwear 91
Clothing 91
Apparel 91
Dance Pose 84.4
Leisure Activities 84.4
Interior Design 82.6
Indoors 82.6
Person 82
Person 80.8
Person 80.3
Person 77.5
People 75.9
Dance 64.5
Person 62.4
Face 62.1
Pants 61.2
Shorts 55.6

Clarifai
created on 2023-10-15

people 99.9
adult 98
group together 97.5
man 96.4
group 95.5
woman 95.4
many 94.8
recreation 91.6
wear 91.5
child 90
monochrome 89.1
music 87.5
two 87.3
athlete 86.9
boy 85.6
musician 82.5
sports equipment 79
outfit 78.4
three 77.4
several 77.1

Imagga
created on 2021-12-15

people 26.2
person 25.3
man 24.8
male 21.5
outdoors 21
uniform 19.5
adult 18.2
child 14.8
military uniform 14.3
clothing 13.9
couple 13.9
outdoor 13.7
men 13.7
lifestyle 13.7
happy 11.9
love 11.8
happiness 11.7
leisure 11.6
sport 11.1
day 11
old 10.4
summer 10.3
outside 10.3
sky 10.2
life 10
sunset 9.9
travel 9.8
grass 9.5
parent 9.4
weapon 9.3
city 9.1
portrait 9.1
vacation 9
fun 9
family 8.9
together 8.8
women 8.7
smiling 8.7
smile 8.5
walking 8.5
two 8.5
senior 8.4
beach 8.4
park 8.2
mother 8.2
lady 8.1
activity 8.1
boy 7.8
black 7.8
war 7.7
active 7.7
youth 7.7
world 7.6
silhouette 7.4
tourism 7.4
father 7.2
looking 7.2
history 7.2
handsome 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98
person 97.6
clothing 97
man 67.1
black and white 66.2
woman 59.5

Color Analysis

Face analysis

AWS Rekognition

Age 26-40
Gender Male, 98.4%
Calm 65.2%
Angry 27%
Sad 3.5%
Disgusted 1.2%
Happy 1%
Confused 0.8%
Fear 0.7%
Surprised 0.6%

AWS Rekognition

Age 22-34
Gender Female, 97.2%
Happy 99.4%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0.1%
Calm 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 35-51
Gender Male, 79.4%
Sad 39.2%
Calm 37.4%
Happy 7.8%
Surprised 5.3%
Confused 3.8%
Angry 3.2%
Fear 2.7%
Disgusted 0.6%

AWS Rekognition

Age 32-48
Gender Male, 95.1%
Calm 29.3%
Angry 24.1%
Sad 20.8%
Confused 9.5%
Surprised 7.2%
Happy 3.4%
Fear 3.1%
Disgusted 2.6%

AWS Rekognition

Age 32-48
Gender Male, 59.3%
Calm 69%
Sad 14.9%
Angry 6.8%
Disgusted 3.6%
Confused 2.7%
Happy 1.7%
Surprised 0.8%
Fear 0.5%

AWS Rekognition

Age 41-59
Gender Male, 94.3%
Sad 93.3%
Calm 4.2%
Angry 0.8%
Happy 0.7%
Confused 0.5%
Disgusted 0.3%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 38-56
Gender Male, 90.9%
Calm 87.5%
Sad 8.8%
Angry 1.9%
Confused 0.7%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 18-30
Gender Male, 71.5%
Angry 26.7%
Sad 25.3%
Calm 22.1%
Happy 9.1%
Disgusted 7.1%
Confused 3.9%
Fear 3.4%
Surprised 2.4%

AWS Rekognition

Age 34-50
Gender Female, 70.9%
Disgusted 42.6%
Calm 38%
Surprised 6.1%
Happy 4.1%
Sad 3.5%
Angry 3%
Fear 1.5%
Confused 1.4%

AWS Rekognition

Age 29-45
Gender Male, 68.8%
Happy 38.2%
Calm 36.7%
Sad 21.6%
Angry 1.5%
Confused 1%
Fear 0.4%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 34-50
Gender Male, 82.4%
Sad 59.6%
Calm 34.8%
Happy 2.5%
Confused 1.4%
Angry 1%
Fear 0.4%
Surprised 0.2%
Disgusted 0.2%
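Each AWS Rekognition face result above pairs an age range and gender estimate with a full emotion distribution whose percentages sum to roughly 100. A minimal sketch of reducing such a distribution to its dominant emotion (a hypothetical helper, not part of any museum or AWS pipeline; the sample dicts are abridged from the first two faces listed above):

```python
# Hypothetical helper: given per-face emotion scores like those listed
# above (emotion name -> confidence percentage), return the single
# highest-confidence emotion and its score.

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Abridged scores for the first two faces from the results above.
faces = [
    {"Calm": 65.2, "Angry": 27.0, "Sad": 3.5, "Happy": 1.0},
    {"Happy": 99.4, "Angry": 0.1, "Surprised": 0.1, "Sad": 0.1},
]

for face in faces:
    emotion, confidence = dominant_emotion(face)
    print(f"{emotion}: {confidence}%")
```

Collapsing the distribution this way discards how uncertain some of these results are: for the face scored Sad 39.2% / Calm 37.4%, the top label wins by under two points, so downstream use would reasonably keep the full distribution rather than only the argmax.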

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.9%
Shoe 91%