Human Generated Data

Title

Untitled (several elderly women wearing hats sitting in living room during a meeting)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14270

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.1
Person 99.1
Person 98.8
Person 98.5
Person 98.2
Person 97.6
Clinic 97.3
Person 97.1
Person 96.2
Room 95.9
Indoors 95.9
Operating Theatre 85.2
Hospital 85.2
Interior Design 73
Person 71.3
People 69.8
Furniture 58.6
Living Room 57.5

Imagga
created on 2022-01-29

interior 38
table 34.6
room 33.9
chair 33.1
bass 29
furniture 25.3
salon 20.5
modern 20.3
indoors 20.2
house 20
indoor 19.2
design 18.6
restaurant 17.7
decor 17.7
window 17.7
home 17.5
shop 17.4
people 16.2
floor 15.8
musical instrument 15.6
wood 15
stringed instrument 14.8
light 14.7
luxury 14.6
man 14.1
style 14.1
architecture 14.1
person 13.6
hall 13.4
business 13.4
musician 12.9
inside 12.9
empty 12.9
men 12.9
music 12.8
male 12.8
glass 12.7
chairs 12.7
guitar 12.7
comfortable 12.4
bowed stringed instrument 12.4
dining 12.4
lifestyle 12.3
group 12.1
barbershop 12
adult 11.9
teacher 11.6
seat 11.3
contemporary 11.3
elegance 10.9
office 10.8
lamp 10.5
building 10.4
classroom 10.3
kitchen 9.9
concert 9.7
apartment 9.6
decoration 9.5
nobody 9.3
dinner 9.3
professional 9
mercantile establishment 8.9
urban 8.7
women 8.7
musical 8.6
instrument 8.4
singer 8.2
plant 8.2
violin 8.2
outfit 8
smiling 8
tables 7.9
life 7.9
wind instrument 7.8
play 7.7
sitting 7.7
elegant 7.7
wall 7.7
sofa 7.7
hotel 7.6
counter 7.6
living 7.6
meeting 7.5
classic 7.4
service 7.4
bar 7.4
occupation 7.3
food 7.2
brass 7.2
cabinet 7.1
work 7.1
day 7.1
wooden 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 96.9
text 93
indoor 86.9
furniture 84.3
clothing 81.6
window 80.1
table 73.8
chair 61.9
man 60.3
people 57.5
dining table 8.4

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 61.1%
Calm 89.5%
Surprised 2.8%
Happy 2.4%
Confused 2.1%
Sad 1.2%
Fear 0.9%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 41-49
Gender Male, 98%
Calm 92.5%
Happy 2.6%
Sad 2.4%
Surprised 1.4%
Confused 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 16-22
Gender Female, 67.5%
Calm 98.1%
Sad 0.6%
Confused 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 100%
Surprised 0%
Sad 0%
Happy 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people standing in front of a window 90.6%
a group of people in a room 90.5%
a group of people standing in a room 90.4%

Text analysis

Amazon

SULETA
EIEW
DUCCO SULETA EIEW
DUCCO

Google

11001181
11001181