Human Generated Data

Title

Untitled (women at Women's Club meeting)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16206

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.4
Human 99.4
Person 97.8
Sitting 95.7
Person 93.5
Clothing 84
Apparel 84
Person 83.2
Furniture 77.1
Flooring 75.6
Chair 73
Floor 62.5
Person 62.4
Portrait 56.3
Face 56.3
Photography 56.3
Photo 56.3
LCD Screen 55.5
Electronics 55.5
Screen 55.5
Monitor 55.5
Display 55.5
Overcoat 55.2
Coat 55.2

Clarifai
created on 2023-10-28

people 99.9
two 98.7
man 98.4
woman 98.3
adult 97.9
wedding 96.3
group 96.2
room 96.1
indoors 95.9
furniture 93.1
chair 93
bride 92.8
veil 92.6
three 92.5
family 92.5
home 89.7
groom 85.9
group together 84.4
wear 83.2
sit 81.7

Imagga
created on 2022-02-05

man 30.9
shop 29.7
people 29.6
home 29.5
salon 27.8
room 26.8
person 25
barbershop 24.9
male 24.2
adult 23.6
indoors 22.8
interior 22.1
teacher 19.5
mercantile establishment 18.4
smiling 18.1
professional 16.6
indoor 16.4
business 15.8
couple 15.7
house 15
happy 15
women 15
educator 13.8
sitting 12.9
businessman 12.4
office 12.3
lifestyle 12.3
place of business 12.2
smile 12.1
men 12
window 12
inside 12
happiness 11.8
new 11.3
chair 11.2
two 11
work 11
family 10.7
life 10.6
lady 10.5
modern 10.5
holding 9.9
cheerful 9.8
portrait 9.7
standing 9.6
clothing 9.5
alone 9.1
worker 9
furniture 8.7
moving 8.6
mature 8.4
classroom 8.3
fashion 8.3
girls 8.2
looking 8
mother 7.8
pretty 7.7
togetherness 7.6
clothes 7.5
style 7.4
newspaper 7.4
color 7.2
dress 7.2
kitchen 7.2
to 7.1
working 7.1
day 7.1
hairdresser 7
together 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 98.7
text 98.2
clothing 94.5
man 90.3
furniture 82.1
black and white 75.3
footwear 71.6
woman 67.2
table 62.2
chair 57.9

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 94.1%
Calm 99.7%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
Sad 0%
Confused 0%
Happy 0%
Angry 0%

AWS Rekognition

Age 22-30
Gender Female, 97.2%
Happy 56.6%
Calm 34.6%
Sad 4.1%
Surprised 1.4%
Fear 1.3%
Confused 0.8%
Angry 0.7%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 97.8%
Person 93.5%
Person 83.2%
Person 62.4%