Human Generated Data

Title

Untitled (two people sleeping on train seats)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15369

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.9
Couch 96.3
Chair 92.4
Armchair 89.5
Person 89
Human 89
Bed 88.9
Cushion 80.4
Cat 77
Animal 77
Mammal 77
Pet 77
Car Seat 61.7
Canine 60.4

Clarifai
created on 2023-10-28

monochrome 99.9
people 99.3
adult 96.2
man 95.8
car 91.7
reclining 90.2
woman 89.4
seat 88.9
wedding 88.8
street 88.4
one 88.1
sleep 87.6
chair 87.4
furniture 86.6
vehicle 86.4
indoors 85.1
wear 84.3
sitting 83.7
travel 82.7
transportation system 82.3

Imagga
created on 2022-03-05

seat 100
support 91.1
car seat 67
device 65.7
plane seat 52.1
rest 30.5
car 26.6
headrest 20.1
interior 19.4
modern 16.8
adult 16.8
vehicle 16.5
people 16.2
transportation 16.1
person 15.4
inside 14.7
armrest 14.3
business 14
sitting 13.7
auto 13.4
sofa 13
home 12.8
leather 12.3
indoors 12.3
drive 12.3
lifestyle 12.3
office 12
technology 11.9
driver 11.7
armchair 11.6
smiling 11.6
automobile 11.5
happy 11.3
attractive 11.2
laptop 11.2
transport 11
care 10.7
driving 10.6
pretty 10.5
wheel 10.4
black 10.2
work 10.2
room 10
travel 9.9
human 9.7
new 9.7
computer 9.7
comfortable 9.5
women 9.5
man 9.4
face 9.2
child 9.2
design 9
comfort 8.7
mirror 8.7
elegant 8.6
smile 8.5
equipment 8.5
chair 8.4
indoor 8.2
light 8
upholstery 8
working 7.9
businessman 7.9
cute 7.9
male 7.8
hand 7.6
power 7.6
lying 7.5
one 7.5
holding 7.4
close 7.4
businesswoman 7.3
suit 7.2
body 7.2
portrait 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.5
cat 96.1
carnivore 89.3
black and white 81.7
furniture 71.5
animal 67.1
bed 51.2
clothes 32.5

Feature analysis

Amazon

Person 89%
Bed 88.9%
Cat 77%

Categories

Imagga

interior objects 79.8%
paintings art 16.9%
food drinks 1.7%

Text analysis

Amazon

JOJ
KODAK-A

Google

YT37A2- XAGON
YT37A2-
XAGON