Human Generated Data

Title

Untitled (men working on machinery digging into street)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15427

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Person 92.8
Vehicle 72.3
Transportation 72.3
Truck 72.1
Outdoors 66.3
Nature 64.6
People 62.1
Text 60.8
Soil 55

Clarifai
created on 2023-10-28

people 99.7
vehicle 99.7
machine 98.6
transportation system 98.4
adult 97.8
grinder 96.5
one 95.9
group 93
war 93
science 92.9
illustration 92.7
aircraft 92.3
container 92
watercraft 91.7
furniture 91.2
no person 90.4
technology 90
machinery 89.8
vintage 89.8
industry 89.6

Imagga
created on 2022-03-05

equipment 28.6
monitor 21.5
device 21.4
computer 20.2
technology 20
electronic equipment 17.1
metal 16.9
business 15.8
work 14.1
television 13.6
machine 13
working 12.4
old 11.1
industry 11.1
screen 10.8
light 10.7
steel 10.6
office 10.6
home appliance 10.3
man 10.1
case 10
men 9.4
electricity 9.4
3d 9.3
hand 9.1
industrial 9.1
appliance 9
digital 8.9
box 8.5
communication 8.4
vintage 8.3
security 8.3
worker 8
apparatus 8
interior 7.9
bright 7.9
engineer 7.8
black 7.8
male 7.8
factory 7.7
laptop 7.7
vehicle 7.7
telephone 7.7
sky 7.6
system 7.6
finance 7.6
call 7.6
energy 7.6
support 7.5
electronic 7.5
room 7.4
safety 7.4
object 7.3
protection 7.3
information 7.1
projector 7.1
safe 7

Microsoft
created on 2022-03-05

text 99.9
book 93.9
sketch 93.4
drawing 91.3
black and white 81
old 80
white 75.9
black 71.9
vintage 43.3
picture frame 8.5

Feature analysis

Amazon

Person 99.2%
Person 92.8%
Truck 72.1%

Categories

Imagga

paintings art 98.3%
interior objects 1.1%

Text analysis

Amazon

PHA
MENT