Human Generated Data

Title

Untitled (women having tea at Women's Club meeting)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16201

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 96.5
Human 96.5
Person 94.2
Person 92.5
Person 84.7
Meal 81.5
Food 81.5
Clothing 77.3
Apparel 77.3
Poster 69.1
Advertisement 69.1
Icing 65
Dessert 65
Cake 65
Cream 65
Creme 65
Dish 62.6
Hair 60
Room 59.7
Indoors 59.7
Collage 57.5
Suit 55.2
Coat 55.2
Overcoat 55.2

Clarifai
created on 2023-10-29

people 99.8
adult 98.6
man 97.5
group 96.3
two 95.1
woman 94.9
monochrome 94.1
indoors 92.6
three 89.5
sit 84.4
group together 84.2
chair 84.1
furniture 84
dressing room 83.7
mirror 82.8
room 82.6
wear 82.2
actress 80.9
scientist 79.7
facial expression 78.4

Imagga
created on 2022-02-05

salon 100
people 39
man 36.3
person 29.2
hairdresser 28.5
shop 27.7
barbershop 27
male 26.2
adult 24.1
room 24.1
indoors 22
home 21.5
women 20.5
smiling 19.5
men 18.9
happy 18.2
medical 17.6
professional 17.2
mercantile establishment 16.8
couple 16.5
work 16.5
happiness 16.4
sitting 16.3
doctor 16
lifestyle 15.9
two 15.2
interior 15
holding 14.8
patient 14.6
indoor 14.6
office 14.6
hospital 14.4
table 13.8
smile 13.5
clinic 13.2
medicine 13.2
health 13.2
cheerful 13
worker 12.6
working 12.4
portrait 12.3
senior 12.2
house 11.7
kitchen 11.6
business 11.5
place of business 11.2
looking 11.2
mature 11.2
care 10.7
family 10.7
job 10.6
businessman 10.6
meeting 10.4
mother 10.2
casual 10.2
horizontal 10
nurse 9.8
together 9.6
life 9.6
togetherness 9.4
occupation 9.2
chair 8.8
two people 8.7
standing 8.7
mid adult 8.7
love 8.7
desk 8.5
modern 8.4
instrument 8.4
color 8.3
fashion 8.3
inside 8.3
human 8.2
style 8.2
lady 8.1
domestic 8.1
team 8.1
group 8.1
to 8
coat 8
restaurant 7.9
brunette 7.8
surgery 7.8
party 7.7
profession 7.7
husband 7.6
adults 7.6
communication 7.6
equipment 7.3
holiday 7.2
face 7.1
surgeon 7.1
day 7.1

Microsoft
created on 2022-02-05

person 94.2
indoor 91.2
text 88.8
wedding dress 74.3
clothing 73.7
vase 68.3
flower 67.3
wedding 60.3
woman 52.8
bride 52.5
table 52.1
old 40.5

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 99.9%
Calm 73.4%
Happy 14.8%
Surprised 5.2%
Disgusted 2.2%
Fear 1.5%
Sad 1.3%
Confused 0.9%
Angry 0.7%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Surprised 81.6%
Calm 13.8%
Disgusted 1.4%
Angry 1.1%
Sad 0.9%
Fear 0.7%
Happy 0.3%
Confused 0.2%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Surprised 61.2%
Calm 26.2%
Fear 4.1%
Sad 2.6%
Happy 2.2%
Confused 2.1%
Disgusted 0.9%
Angry 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 96.5%
Person 94.2%
Person 92.5%
Person 84.7%

Categories

Imagga

paintings art 90.4%
people portraits 6.8%
food drinks 1.4%

Text analysis

Amazon

not