Human Generated Data

Title

Untitled (women's club making crafts)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19790

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.5
Person 99.4
Person 99.3
Person 99.2
Person 99.2
Person 99
Person 98.9
Person 98.5
Person 98.4
Person 98.3
Person 97.9
Room 97.4
Indoors 97.4
Person 96.8
Person 94.9
Person 94.6
Workshop 91.2
Classroom 91.2
School 91.2
Crowd 85.9
Audience 77.9
People 77.4
Face 74.9
Furniture 73.9
Table 68.6
Clothing 65.5
Apparel 65.5
Building 65.4
Cafeteria 64.6
Restaurant 64.6
Meal 62.9
Food 62.9
Sitting 60.2
Seminar 55.3
Speech 55.3
Lecture 55.3

Clarifai
created on 2023-10-22

people 99.7
group 99.2
woman 97.7
adult 94.6
group together 94.1
room 94
child 93.2
furniture 93.1
many 92.1
man 92
education 91.1
sit 89.4
administration 88.4
war 87.6
monochrome 86.2
music 84.9
indoors 84.8
recreation 83.5
teacher 82.5
leader 81.2

Imagga
created on 2022-03-05

classroom 54
room 40.7
man 36.2
people 34.5
male 30.5
person 28.1
adult 21.6
table 21.6
men 21.4
business 21.2
group 20.1
happy 19.4
businessman 19.4
women 19
shop 17.7
smiling 17.3
teacher 16.1
counter 16
lifestyle 15.9
indoors 15.8
couple 15.7
sitting 15.4
friends 15
restaurant 14.6
meeting 14.1
suit 12.6
work 12.5
education 12.1
party 12
businesswoman 11.8
office 11.7
mercantile establishment 11.7
worker 11.6
smile 11.4
corporate 11.2
professional 11.2
executive 11.1
chair 10.9
barbershop 10.9
team 10.7
cheerful 10.6
together 10.5
student 10.2
teamwork 10.2
happiness 10.2
communication 10.1
playing 10
center 10
music 9.9
handsome 9.8
job 9.7
interior 9.7
working 9.7
computer 9.6
boy 9.6
standing 9.5
learning 9.4
modern 9.1
hand 9.1
attractive 9.1
holding 9.1
black 9
home 8.8
place of business 8.7
class 8.7
talking 8.5
school 8.5
study 8.4
horizontal 8.4
house 8.3
occupation 8.2
human 8.2
board 8.1
marimba 8
night 8
to 8
teaching 7.8
colleagues 7.8
portrait 7.8
musical instrument 7.7
drinking 7.6
casual 7.6
dinner 7.6
drink 7.5
leisure 7.5
inside 7.4
percussion instrument 7.3
indoor 7.3
love 7.1
life 7

Google
created on 2022-03-05

Photograph 94.3
Black 89.9
Table 87.2
Black-and-white 84.5
Style 84
Monochrome 77.7
Monochrome photography 76.4
Jacket 76.4
T-shirt 74.4
Snapshot 74.3
Event 74.3
Font 69.6
Vintage clothing 68.6
Team 67.6
Room 65.1
History 64.8
Class 64.5
Child 64.2
Recreation 63.4
Cooking 62.2

Microsoft
created on 2022-03-05

text 98.4
person 97.6
indoor 93.4
man 77.1
clothing 76.8
group 71.4
black and white 62
table 50.9

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Sad 36.1%
Happy 30.3%
Calm 23.8%
Confused 4.1%
Surprised 1.6%
Disgusted 1.5%
Fear 1.4%
Angry 1.3%

AWS Rekognition

Age 49-57
Gender Male, 99.6%
Calm 49%
Happy 25.1%
Sad 10.3%
Disgusted 6.5%
Confused 3.9%
Fear 2.6%
Angry 1.3%
Surprised 1.2%

AWS Rekognition

Age 42-50
Gender Male, 90.2%
Sad 62%
Happy 23.4%
Calm 5.9%
Confused 3.2%
Disgusted 1.8%
Angry 1.8%
Surprised 1.1%
Fear 0.8%

AWS Rekognition

Age 48-56
Gender Male, 95.4%
Happy 44.6%
Sad 39.2%
Calm 6.8%
Fear 2.4%
Surprised 2.2%
Disgusted 2.1%
Confused 1.8%
Angry 1%

AWS Rekognition

Age 43-51
Gender Male, 99.1%
Happy 57%
Calm 17.2%
Sad 10%
Fear 8.7%
Confused 2.4%
Surprised 1.9%
Disgusted 1.8%
Angry 1%

AWS Rekognition

Age 48-54
Gender Male, 88.5%
Happy 59.3%
Calm 38.1%
Confused 0.8%
Surprised 0.4%
Angry 0.4%
Sad 0.4%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 50-58
Gender Female, 97.3%
Calm 93.9%
Happy 5.8%
Sad 0.1%
Confused 0.1%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Female, 95.4%
Happy 89.3%
Calm 8.5%
Sad 0.9%
Confused 0.4%
Fear 0.3%
Disgusted 0.3%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Happy 45.3%
Surprised 39.3%
Calm 6.1%
Sad 3%
Fear 2%
Confused 1.8%
Angry 1.6%
Disgusted 0.9%

AWS Rekognition

Age 48-54
Gender Female, 94.7%
Happy 87.2%
Calm 10.8%
Sad 1%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 39-47
Gender Male, 79%
Calm 37%
Happy 31.9%
Sad 19.1%
Confused 5.3%
Disgusted 2%
Fear 1.7%
Surprised 1.6%
Angry 1.4%

AWS Rekognition

Age 48-54
Gender Female, 94.4%
Happy 99.6%
Calm 0.2%
Sad 0.1%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.5%
Person 99.4%
Person 99.3%
Person 99.2%
Person 99.2%
Person 99%
Person 98.9%
Person 98.5%
Person 98.4%
Person 98.3%
Person 97.9%
Person 96.8%
Person 94.9%
Person 94.6%