Human Generated Data

Title

Untitled (group portrait, Brown University)

Date

1885

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2193

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 99.4
Person 99.2
Person 98.9
Shoe 97.5
Footwear 97.5
Clothing 97.5
Apparel 97.5
Person 97.5
People 94.9
Shoe 94.5
Person 87.8
Person 86.1
Person 85.8
Person 84
Person 84
Person 76.9
Bush 74.7
Plant 74.7
Vegetation 74.7
Porch 70.1
Shoe 69
Person 66.5
Shoe 63.8
Housing 61.7
Building 61.7
Painting 61.1
Art 61.1
Door 59.1
Family 58.4
Jury 58.3
Meal 57.3
Food 57.3
Person 55.7
Shoe 54.4
Shoe 51.7
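
The Amazon tags above are object and scene labels with confidence scores (percentages), presumably produced by Amazon Rekognition's label detection, the same service named in the face analysis below. A minimal sketch of how comparable tags could be generated with boto3's detect_labels call, assuming AWS credentials are configured and using a placeholder file name for the photograph:

```python
# Minimal sketch: Rekognition-style label tags for a local image file.
# The file path is a placeholder; AWS credentials are assumed to be configured.
import boto3

def label_image(path: str, min_confidence: float = 50.0):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,  # drop labels below ~50%, like the list above
    )
    # Each label carries a name and a confidence percentage, e.g. "Person 99.6".
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

label_image("untitled_group_portrait_brown_university.jpg")
```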

Clarifai
created on 2023-10-15

people 100
group 99.6
child 99
woman 98
group together 97.9
many 97.4
adult 97.4
boy 97
man 96.6
leader 94.4
outfit 93.3
uniform 93.1
administration 91.6
portrait 91.3
street 90.9
family 88.6
military 88.2
sepia 87.5
wear 86.9
soldier 86.5

Imagga
created on 2021-12-15

sculpture 50.9
kin 50.1
statue 46.5
architecture 34.5
ancient 32.9
religion 32.3
stone 28.9
history 28.6
art 28.5
culture 27.3
temple 26.9
travel 26.8
old 23.7
building 23
tourism 22.3
monument 21.5
god 19.1
carving 19.1
keepsake 18.7
landmark 18.1
religious 17.8
famous 17.7
city 17.5
historical 16.9
fountain 15.9
world 15.8
column 15.7
historic 15.6
traditional 14.1
church 13.9
figure 13.4
marble 12.6
heritage 12.6
holy 12.5
spirituality 12.5
spiritual 12.5
worship 11.6
antique 11.6
decoration 10.9
statues 9.9
carved 9.8
tourist 9.3
face 9.2
palace 9
people 8.9
meditation 8.6
structure 8.6
museum 8.1
roman 7.9
catholic 7.9
angel 7.8
facade 7.7
faith 7.7
china 7.5
town 7.4
tradition 7.4
peace 7.3
detail 7.2
military uniform 7.2
love 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

clothing 92
person 92
child 72.9
text 68.7
posing 66.6
altar 17.8

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 98.1%
Calm 98.7%
Angry 0.7%
Sad 0.2%
Surprised 0.2%
Confused 0.1%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 24-38
Gender Male, 95%
Calm 91.5%
Sad 4%
Angry 2.9%
Fear 0.9%
Surprised 0.2%
Confused 0.2%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Male, 96.1%
Calm 84.1%
Angry 9.8%
Fear 2.3%
Sad 2.1%
Disgusted 0.7%
Surprised 0.6%
Happy 0.3%
Confused 0.1%

AWS Rekognition

Age 23-35
Gender Male, 96.2%
Calm 97.5%
Angry 0.7%
Sad 0.7%
Happy 0.6%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 23-37
Gender Male, 98.5%
Calm 93.3%
Sad 3.3%
Angry 1.7%
Confused 0.8%
Happy 0.3%
Fear 0.3%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 24-38
Gender Male, 96.1%
Calm 97.5%
Angry 0.8%
Sad 0.6%
Fear 0.5%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 23-37
Gender Male, 97.1%
Calm 82%
Sad 10.1%
Confused 3.9%
Angry 1.7%
Fear 1.3%
Surprised 0.6%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Male, 97.1%
Calm 91.6%
Fear 2.6%
Sad 2.5%
Angry 1.7%
Happy 0.7%
Confused 0.4%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 29-45
Gender Male, 93.2%
Confused 60.9%
Calm 18.2%
Angry 5.6%
Fear 4.9%
Surprised 4%
Sad 3.4%
Disgusted 2.7%
Happy 0.2%

AWS Rekognition

Age 42-60
Gender Male, 81.9%
Calm 97.8%
Sad 1%
Angry 0.4%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 24-38
Gender Male, 92.7%
Calm 96.3%
Sad 2.6%
Angry 0.5%
Fear 0.3%
Confused 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 21-33
Gender Male, 96.4%
Calm 68.7%
Sad 29.2%
Angry 1.7%
Surprised 0.2%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 23-35
Gender Male, 89.8%
Calm 99.8%
Sad 0.1%
Angry 0.1%
Fear 0%
Happy 0%
Surprised 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 28-44
Gender Male, 96.8%
Calm 84.3%
Angry 8.4%
Sad 5.5%
Confused 0.9%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 20-32
Gender Male, 90.9%
Calm 78.5%
Sad 18.8%
Angry 1.3%
Fear 0.9%
Confused 0.2%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 20-32
Gender Male, 94.4%
Angry 92.6%
Calm 5.2%
Fear 1.3%
Sad 0.4%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 33-49
Gender Male, 93.2%
Calm 90.9%
Angry 6%
Sad 1.5%
Happy 0.7%
Surprised 0.3%
Fear 0.3%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Male, 96%
Calm 98.2%
Sad 1.2%
Angry 0.3%
Fear 0.1%
Happy 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 36-52
Gender Male, 94.7%
Calm 91.3%
Sad 5.1%
Happy 2%
Angry 0.6%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 22-34
Gender Male, 99.9%
Calm 91.9%
Sad 5.9%
Angry 1.1%
Happy 0.4%
Confused 0.3%
Fear 0.2%
Surprised 0.1%
Disgusted 0.1%
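
The per-face entries above (age range, gender, and emotion scores) follow the shape of Amazon Rekognition's face detection output. A minimal sketch of how such attributes could be retrieved with boto3's detect_faces call, again assuming configured AWS credentials and the same placeholder file name:

```python
# Minimal sketch: per-face age range, gender, and emotion scores via Rekognition.
# The file path is a placeholder; AWS credentials are assumed to be configured.
import boto3

def analyze_faces(path: str):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, not just bounding boxes
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with confidences; sort to mirror the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

analyze_faces("untitled_group_portrait_brown_university.jpg")
```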

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
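
The Google Vision entries above report face attributes as likelihood buckets (Very unlikely, Unlikely, and so on) rather than percentages. A minimal sketch of how those per-face likelihoods could be read with the google-cloud-vision client library (assuming a version that exposes the vision.Likelihood enum, configured Google Cloud credentials, and the same placeholder file name):

```python
# Minimal sketch: per-face likelihood buckets via the Google Cloud Vision API.
# The file path is a placeholder; Google Cloud credentials are assumed to be configured.
from google.cloud import vision

def face_likelihoods(path: str):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enum buckets such as VERY_UNLIKELY, matching "Very unlikely" above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)

face_likelihoods("untitled_group_portrait_brown_university.jpg")
```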

Feature analysis

Amazon

Person 99.6%
Shoe 97.5%
Painting 61.1%