Human Generated Data

Title

Untitled (elderly women at DAR meeting)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15642

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 97.5
Human 97.5
Person 97
Person 91.2
Poster 88
Advertisement 88
Person 81
Clinic 80.6
Doctor 76.9
People 65.1
Indoors 62.3
Person 61.2
Hairdresser 59
Worker 59
Room 58.7
Photography 58.1
Photo 58.1
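
The Amazon labels above have the shape of output returned by Rekognition's DetectLabels API. A minimal sketch of such a call with boto3, assuming AWS credentials are configured and using a placeholder file name for the photograph:

import boto3

# Sketch only: produce label tags of the kind listed above with Amazon
# Rekognition's DetectLabels. The file name is a placeholder, not the
# museum's actual image path.
client = boto3.client("rekognition")
with open("4.2002.15642.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))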

Clarifai
created on 2023-10-28

people 99.8
adult 99.1
woman 98.8
man 97.8
group 97.1
sit 96.6
monochrome 96
chair 94.9
two 93.1
medical practitioner 92
hospital 90.2
indoors 88.4
room 87.9
furniture 86.4
three 84.2
sitting 83.1
education 82.9
desk 78.4
seat 77.1
healthcare 75.6

Imagga
created on 2022-02-05

home 25.5
house 21.7
interior 20.3
room 17.9
people 17.3
person 16.1
design 15.2
modern 14.7
sketch 14.6
drawing 14
window 12.7
man 12.1
indoor 11.9
art 11.5
indoors 11.4
newspaper 11.2
style 11.1
domestic 11.1
adult 11.1
black 10.8
furniture 10.7
product 10.5
vintage 9.9
office 9.8
retro 9.8
new 9.7
decoration 9.5
work 9.4
film 9.4
wall 9.4
grunge 9.4
creation 9.2
inside 9.2
business 9.1
comic book 9
kitchen 8.9
paper 8.8
luxury 8.6
elegant 8.6
shop 8.4
salon 8.3
negative 8.3
fashion 8.3
building 8.2
refrigerator 7.9
representation 7.8
architecture 7.8
flower 7.7
frame 7.6
pattern 7.5
sexy 7.2
worker 7.1
male 7.1
kid 7.1
working 7.1
berry 7
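
The Imagga tags above follow the shape of responses from Imagga's /v2/tags endpoint. A minimal sketch using requests, assuming API credentials and a publicly reachable image URL (both placeholders):

import requests

# Sketch only: request tags of the kind listed above from Imagga's
# /v2/tags endpoint. Credentials and image URL are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/4.2002.15642.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))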

Microsoft
created on 2022-02-05

text 98.6
window 89.4
person 79.2
clothing 78.9
drawing 71.4
cartoon 60.5
old 53.6
posing 35.4

Face analysis

AWS Rekognition

Age 54-62
Gender Female, 58.1%
Calm 99.9%
Sad 0%
Surprised 0%
Angry 0%
Confused 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 53-61
Gender Female, 54.4%
Calm 87%
Sad 8.1%
Confused 3%
Disgusted 0.8%
Angry 0.3%
Fear 0.3%
Happy 0.3%
Surprised 0.2%

AWS Rekognition

Age 37-45
Gender Male, 64.8%
Angry 34.7%
Fear 15.5%
Surprised 14.9%
Sad 9.8%
Happy 8.6%
Calm 7%
Disgusted 5.9%
Confused 3.7%
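
Each AWS Rekognition entry above (age range, gender, and emotion percentages) matches the FaceDetails structure returned by DetectFaces. A minimal sketch with boto3, again using a placeholder file name:

import boto3

# Sketch only: face attributes of the kind listed above come from Amazon
# Rekognition's DetectFaces with Attributes=["ALL"]. Placeholder file name.
client = boto3.client("rekognition")
with open("4.2002.15642.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")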

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
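
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred with likelihood words) mirror the face_annotations returned by the Cloud Vision API. A minimal sketch with the google-cloud-vision client, assuming application default credentials and the same placeholder file name:

from google.cloud import vision

# Sketch only: likelihood labels such as "Very unlikely" correspond to the
# Likelihood enum on each face annotation. Placeholder file name.
client = vision.ImageAnnotatorClient()
with open("4.2002.15642.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)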

Feature analysis

Amazon

Person
Person 97.5%
Person 97%
Person 91.2%
Person 81%
Person 61.2%

Poster
Poster 88%

Categories

Imagga

paintings art 99%