Human Generated Data

Title

Untitled (four women sitting in chairs)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20236

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Chair 98.6
Furniture 98.6
Person 98.4
Person 98.2
Clothing 97.1
Apparel 97.1
Floor 87.9
Person 86.9
Flooring 83.3
Shorts 73.2
Female 68.9
People 67.7
Photography 65.5
Photo 65.5
Portrait 64.4
Face 64.4
Sitting 63.6
Girl 63.2
Indoors 62.7
Room 58.9

Clarifai
created on 2023-10-22

people 99.9
group 98.7
group together 97.3
furniture 96.3
wear 96.2
woman 96.1
man 95.6
adult 95.6
child 95.3
room 95.3
monochrome 94.2
recreation 91.2
street 88
family 87.5
music 87.3
several 87
indoors 85.7
home 85
boy 82
art 81.1

Imagga
created on 2022-03-05

dancer 34.5
person 28.1
performer 27.1
people 24
adult 20.7
musical instrument 19.9
entertainer 19.1
accordion 17.6
silhouette 16.6
portrait 16.2
man 16.1
keyboard instrument 15.8
teacher 14.4
fashion 14.3
posing 14.2
life 13.7
body 13.6
male 13.5
fountain 13.1
men 12.9
sexy 12.9
human 12.7
women 12.7
city 12.5
model 12.4
urban 12.2
business 12.1
group 12.1
educator 11.8
professional 11.8
art 11.6
sport 11.5
structure 11.4
wind instrument 11.4
style 11.1
kin 11.1
exercise 10.9
black 10.8
picket fence 10.6
businessman 10.6
lady 10.6
motion 10.3
happy 10
pose 10
leisure 10
dress 9.9
activity 9.9
attractive 9.8
walking 9.5
sitting 9.5
lifestyle 9.4
world 9.2
dark 9.2
hair 8.7
scene 8.7
wall 8.6
grunge 8.5
pretty 8.4
relaxation 8.4
old 8.4
fence 8.3
fitness 8.1
water 8
design 7.9
work 7.9
face 7.8
summer 7.7
modern 7.7
elegant 7.7
move 7.7
balance 7.6
window 7.5
vintage 7.4
action 7.4
barrier 7.4
blond 7.3
sensual 7.3
sun 7.2
success 7.2
dirty 7.2
sunset 7.2
cool 7.1
interior 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

floor 96
text 92.7
person 90.9
clothing 90
furniture 72.7
footwear 66.2

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 83.9%
Surprised 7%
Confused 3.6%
Happy 2.7%
Sad 2%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 52-60
Gender Female, 75.6%
Happy 77.7%
Sad 8.7%
Calm 6.1%
Confused 4.5%
Angry 1.1%
Disgusted 0.8%
Surprised 0.6%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Person 98.4%
Person 98.2%
Person 86.9%
Chair 98.6%

Categories

Text analysis

Amazon

II3
VT77AP

Google

YAGON
YAGON