Human Generated Data

Title

Untitled (two women sleeping on train seats)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15364

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Couch 99.6
Chair 96.5
Person 95.3
Human 95.3
Face 89.4
Cushion 88.8
Person 88.7
Female 85.6
Clothing 75.4
Apparel 75.4
Person 72.3
Indoors 71
Pillow 70.2
Girl 69.2
Portrait 68.7
Photography 68.7
Photo 68.7
Sunglasses 65.2
Accessories 65.2
Accessory 65.2
Woman 62.1
Kid 57.9
Child 57.9
Room 56.8
Electronics 56.1
Pc 55.7
Computer 55.7
Living Room 55.4
Meal 55.3
Food 55.3

Clarifai
created on 2023-10-29

people 99.8
monochrome 99.7
woman 98.6
adult 98.3
man 97.1
indoors 94.6
portrait 93.9
room 93.6
one 93.2
sit 93.1
seat 93
girl 92.7
chair 89.9
family 89.9
two 88.8
music 88.8
reclining 88.5
bed 86.8
couple 85.6
sleep 84

Imagga
created on 2022-03-05

car 54.2
vehicle 34.8
automobile 28.7
transportation 26.9
device 25.5
driver 25.2
auto 24.9
seat 21.9
wheel 19.9
drive 19.8
transport 19.2
man 18.1
driving 16.4
luxury 15.4
person 15.2
male 14.9
support 14.6
interior 14.1
modern 14
windshield 12.9
adult 12.9
speed 12.8
sitting 12
inside 12
power 11.7
people 11.7
motor 11.6
protective covering 11.3
road 10.8
technology 10.4
business 10.3
black 10.2
headlight 10
machine 9.9
hand 9.9
strainer 9.7
new 9.7
engine 9.6
screen 9.5
fast 9.3
window 9.2
travel 9.1
vessel 9
mask 8.9
smiling 8.7
expensive 8.6
traffic 8.5
cockpit 8.5
portrait 8.4
fashion 8.3
tire 8.1
filter 8.1
metal 8
looking 8
work 7.8
face 7.8
glass 7.8
men 7.7
highway 7.7
mirror 7.7
covering 7.6
rest 7.5
happy 7.5
close 7.4
protection 7.3
suit 7.2
control 7.2
home appliance 7.1

Microsoft
created on 2022-03-05

indoor 94.7
text 88.8
window 86.3
black and white 86.3
person 85.2
white goods 83.3
open 48.9

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 53.4%
Calm 99.9%
Surprised 0%
Sad 0%
Disgusted 0%
Angry 0%
Happy 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 98.1%
Calm 89.6%
Sad 8.8%
Confused 0.5%
Angry 0.3%
Surprised 0.3%
Disgusted 0.3%
Happy 0.2%
Fear 0.2%

Feature analysis

Amazon

Person
Sunglasses
Person 95.3%
Person 88.7%
Person 72.3%
Sunglasses 65.2%

Categories

Imagga

paintings art 91.1%
interior objects 6.7%

Text analysis

Amazon

TOJ
K4O
K4O Day
Day