Human Generated Data

Title

Untitled (woman with four children seated in living room, portrait of man on wall above)

Date

1940s

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12407

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.9
Human 99.9
Person 99.8
Person 99.1
Person 98.6
Person 97.4
Person 97.2
Furniture 90.7
People 89.5
Helmet 88
Clothing 88
Apparel 88
Art 73.7
Indoors 69.8
Room 64.2
Photography 63.1
Photo 63.1
Portrait 62.6
Face 62.6
Drawing 61.1
Linen 59.9
Home Decor 59.9
Living Room 57.5
Table 56.7
Family 55.5

Clarifai
created on 2023-10-27

people 100
group 99.9
leader 98.7
adult 98.2
furniture 98
seat 96
three 95.8
offspring 95.6
group together 95.6
several 95.4
man 94.9
child 94.2
two 93.9
administration 93.9
chair 93.7
room 93.3
woman 93
family 92.5
sit 92.1
many 91.6

Imagga
created on 2022-02-04

blackboard 60.7
room 19.5
man 17.5
people 15.6
person 15.4
male 14.9
interior 14.1
wall 12.8
old 12.5
musical instrument 12.2
building 12.1
classroom 11.7
house 11.7
vintage 11.6
grunge 11.1
decoration 11
business 10.9
retro 10.6
businessman 10.6
men 10.3
black 10.2
home 9.6
drawing 9.5
child 9.5
education 9.5
antique 9.5
art 9.4
youth 9.4
chair 9.2
portrait 9
school 8.9
style 8.9
family 8.9
boy 8.7
ancient 8.6
holiday 8.6
design 8.4
window 7.9
couple 7.8
architecture 7.8
modern 7.7
percussion instrument 7.6
furniture 7.6
equipment 7.5
student 7.5
cheerful 7.3
indoor 7.3
computer 7.3
teacher 7.2
to 7.1
happiness 7
travel 7

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 99.1
person 97
clothing 95.6
man 94.4
old 73.6
several 13.5

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 76.9%
Happy 94.6%
Calm 5%
Sad 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0%
Confused 0%

Feature analysis

Amazon

Person
Helmet
Person 99.9%

Text analysis

Amazon

DAL
5
2
7
DAL 5 2 G
2612
G
7 2612 go
go
Korea
Keter

Google

DAL 5. 1 2 5 YTTAP HAMT2A3
DAL
5.
1
2
5
YTTAP
HAMT2A3