Human Generated Data

Title

Untitled (picnic, people at long tables)

Date

1957

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19827
Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.3
Human 98.3
Person 94.8
Crowd 94.7
Person 94.4
Person 92.9
Pedestrian 92.8
Person 91
Person 90.6
Person 88.7
Person 85.9
Marching 82.4
People 74.7
Person 72.5
Person 71.5
Funeral 71.4
Parade 66.5
Person 64.8
Person 62.5
Person 62
Clothing 59.9
Apparel 59.9
Person 44

Clarifai
created on 2023-10-22

people 99.9
many 99.7
group together 99.3
group 98.4
crowd 96.7
man 96.2
adult 94.4
woman 93.5
war 92.7
military 91.6
leader 90.8
administration 88.7
soldier 87.3
wear 84.7
furniture 84.1
recreation 83.5
audience 81.6
one 80.9
education 79.8
seat 79.4

Imagga
created on 2022-03-05

sax 21.2
winter 15.3
tree 15
black 14.6
season 14
pattern 13.7
texture 13.2
pine 12.5
decoration 12.3
holiday 11.5
grunge 11.1
city 10.8
surface 10.6
seasonal 10.5
snow 10.2
man 9.7
detail 9.6
sky 9.6
design 9.6
wind instrument 9.5
close 9.1
outdoors 8.5
art 8.4
musical instrument 8.2
dirty 8.1
clothing 7.9
textured 7.9
people 7.8
male 7.8
color 7.8
cold 7.7
person 7.6
sculpture 7.6
traditional 7.5
tradition 7.4
device 7.3
paint 7.2
aged 7.2
world 7.2
fan 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.9
outdoor 96.3
black and white 91.3
person 80.4
people 79
crowd 42.7

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.5%
Calm 94.3%
Sad 2%
Confused 1.1%
Fear 1%
Disgusted 0.8%
Surprised 0.5%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 20-28
Gender Female, 56.2%
Calm 59.7%
Sad 21%
Fear 10.4%
Happy 4.2%
Disgusted 2.1%
Angry 1.3%
Surprised 0.8%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Female, 57%
Calm 99.9%
Sad 0%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Male, 97.6%
Fear 84%
Calm 9.6%
Sad 3%
Happy 1.1%
Disgusted 0.8%
Confused 0.7%
Surprised 0.4%
Angry 0.4%

AWS Rekognition

Age 21-29
Gender Female, 57.9%
Sad 85.6%
Calm 8.5%
Fear 1.7%
Happy 1.5%
Confused 1.5%
Disgusted 0.4%
Surprised 0.4%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.3%
Person 94.8%
Person 94.4%
Person 92.9%
Person 91%
Person 90.6%
Person 88.7%
Person 85.9%
Person 72.5%
Person 71.5%
Person 64.8%
Person 62.5%
Person 62%
Person 44%

Text analysis

Amazon

NAGOY
MUR NAGOY
MUR