Human Generated Data

Title

Untitled (group portrait of Yakima Moose Association, women's club)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19799

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Person 99.5
Human 99.5
Dining Table 99.5
Table 99.5
Person 99.4
Person 99.3
Person 98.9
Tabletop 98.9
Person 98.8
Person 98.7
Person 98.6
Chair 98.4
Restaurant 98.2
Person 97.8
Room 95.4
Indoors 95.4
Dining Room 95.4
Person 92.8
Meal 91.9
Food 91.9
Person 91.8
Clothing 89.4
Apparel 89.4
Dish 86.4
Person 84.3
Face 81.6
Cafeteria 78
Cafe 76
People 73.3
Suit 73.1
Overcoat 73.1
Coat 73.1
Sitting 64.6
Photo 64.1
Photography 64.1
Portrait 63.8
Female 63
Chair 60.9
Girl 58.1
Child 57
Kid 57

Imagga
created on 2022-03-05

classroom 39.1
man 32.9
room 32.5
brass 31.3
people 29.5
blackboard 29.1
male 27.6
person 27.2
wind instrument 27.1
musical instrument 25
adult 22.8
business 21.8
group 21.7
kin 21.7
businessman 20.3
men 18.9
chair 18.8
teacher 18.5
education 18.2
happy 16.9
women 16.6
student 15.9
school 15.8
sitting 15.4
couple 14.8
smiling 14.5
indoors 14
office 13.6
home 13.5
professional 13.4
interior 13.3
table 13.1
cheerful 13
work 12.5
class 12.5
lifestyle 12.3
together 12.2
meeting 12.2
friends 12.2
study 12.1
corporate 12
executive 12
businesswoman 11.8
board 11.7
portrait 11.6
team 11.6
holding 11.5
boy 11.3
indoor 10.9
worker 10.9
child 10.8
job 10.6
studying 10.5
teamwork 10.2
casual 10.2
communication 10.1
modern 9.8
trombone 9.8
students 9.7
businesspeople 9.5
learning 9.4
two 9.3
smile 9.3
attractive 9.1
black 9
restaurant 8.9
kid 8.9
standing 8.7
wine 8.3
girls 8.2
family 8
looking 8
to 8
happiness 7.8
glass 7.8
wall 7.7
drinking 7.6
reading 7.6
enjoying 7.6
house 7.5
drink 7.5
life 7.5
friendship 7.5
manager 7.4
laptop 7.3
confident 7.3
percussion instrument 7.2
employee 7.1
handsome 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.8
table 97
chair 95.5
furniture 93.2
clothing 80.6
text 78.1
posing 74.1
group 74
man 52.3

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 96.5%
Calm 35.3%
Sad 29.1%
Confused 11.2%
Surprised 6.6%
Fear 5.7%
Happy 5.3%
Angry 3.9%
Disgusted 2.9%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Happy 31.2%
Sad 27.5%
Calm 19.1%
Confused 10.2%
Disgusted 4%
Surprised 3.9%
Fear 2.2%
Angry 1.9%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Calm 75.1%
Happy 20.6%
Disgusted 1.4%
Angry 1%
Confused 0.7%
Sad 0.6%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Male, 97.7%
Calm 40.7%
Sad 36%
Confused 9.4%
Disgusted 6.3%
Fear 2.7%
Happy 2.1%
Angry 1.7%
Surprised 1.1%

AWS Rekognition

Age 31-41
Gender Female, 58.9%
Calm 65%
Sad 15.4%
Happy 7%
Confused 5.6%
Surprised 2.9%
Disgusted 1.8%
Angry 1.2%
Fear 1%

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 99.3%
Happy 0.4%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 99.1%
Calm 84.6%
Sad 8.4%
Confused 2.2%
Surprised 1.7%
Disgusted 1.6%
Angry 0.6%
Happy 0.6%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Male, 99.2%
Calm 84.5%
Happy 6.5%
Sad 2.6%
Confused 2.3%
Surprised 2%
Disgusted 1.1%
Fear 0.7%
Angry 0.5%

AWS Rekognition

Age 45-51
Gender Male, 82.1%
Calm 52.2%
Happy 43%
Confused 1.4%
Surprised 1.1%
Disgusted 1%
Fear 0.5%
Sad 0.4%
Angry 0.3%

AWS Rekognition

Age 49-57
Gender Female, 99.3%
Happy 79.8%
Calm 17.3%
Disgusted 0.9%
Sad 0.8%
Surprised 0.5%
Angry 0.3%
Fear 0.2%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 98.4%

Captions

Microsoft

a group of people posing for a photo 95.5%
a group of people posing for the camera 95.4%
a group of people posing for a picture 95.3%

Text analysis

Amazon

1500H
NOKER

Google

C:
C: