Human Generated Data

Title

Untitled (mill worker with table saw)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18432

Machine Generated Data

Tags

Amazon
created on 2022-03-04; values are label confidence scores (%)

Person 99.3
Human 99.3
Workshop 98.6
Wood 92
Carpenter 91.9
Plywood 88.4
Factory 81.5
Building 81.5
Forge 78
Worker 73.9
Apparel 70.6
Clothing 70.6
Hardhat 60.4
Helmet 60.4
Shelf 58.2
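
The label/score pairs above are the kind of output returned by an image-labeling API such as Amazon Rekognition. A minimal sketch of how labels like these could be produced is below, assuming the boto3 SDK, AWS credentials configured in the environment, and a hypothetical local filename for the photograph; this is an illustration, not the museum's actual pipeline.

```python
import boto3

client = boto3.client("rekognition")

# "mill_worker.jpg" is a hypothetical filename standing in for this photograph.
with open("mill_worker.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # drop labels below 55% confidence
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```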

Imagga
created on 2022-03-04; values are tag confidence scores (%)

percussion instrument 81
musical instrument 61.4
marimba 57.9
vibraphone 32.6
work 31.4
tool 28.9
construction 26.5
merchandise 23.2
worker 23.1
wood 21.7
man 20.8
device 20.4
male 19.1
equipment 18.9
industry 17.9
wooden 17.6
working 16.8
tools 16.1
job 15
repair 14.4
home 14.3
person 14.2
carpenter 13.8
occupation 13.7
professional 13.5
closeup 13.5
people 13.4
indoors 13.2
hand 12.9
glockenspiel 12.8
business 12.7
steel 12.6
build 12.3
metal 12.1
carpentry 11.7
labor 11.7
cut 11.6
old 11.1
industrial 10.9
builder 10.8
hammer 10.7
instrument 10.6
building 10.3
screwdriver 9.9
renovation 9.8
fix 9.7
interior 9.7
close 9.7
adult 9.7
hands 9.5
craft 9.5
table 9.5
men 9.4
inside 9.2
hand tool 9
machine 8.9
woodwork 8.9
manual 8.8
lifestyle 8.7
profession 8.6
house 8.3
rifle 8.2
board 8.1
object 8.1
office 8
factory 7.9
improvement 7.7
gun 7.6
floor 7.4
technology 7.4
indoor 7.3
portrait 7.1
to 7.1
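
Imagga exposes its tagger as a REST endpoint rather than a cloud SDK. A minimal sketch using the requests library is below; the API key and secret placeholders and the filename are assumptions, and the response parsing follows Imagga's documented v2 response shape.

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credential

# "mill_worker.jpg" is a hypothetical filename standing in for this photograph.
with open("mill_worker.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```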

Google
created on 2022-03-04; values are label confidence scores (%)

Shirt 94.3
Black 89.5
Wood 85.4
Black-and-white 83
Publication 77.3
Monochrome photography 73.1
Monochrome 71.9
Metal 71
Tradesman 70.6
Machine 67.7
Room 65.4
Hardwood 63.6
Steel 62.2
Workwear 62.2
Flooring 61.9
Blue-collar worker 58.8
Job 55.5
Pipe 55.2
Factory 55
Book 51.8
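
These labels match the output of Google Cloud Vision's label detection. A minimal sketch with the google-cloud-vision client library is below; the filename is hypothetical, and the scores, returned in the 0-1 range, are scaled to percentages to match the listing above.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "mill_worker.jpg" is a hypothetical filename standing in for this photograph.
with open("mill_worker.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```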

Microsoft
created on 2022-03-04; values are tag confidence scores (%)

man 98
text 95.8
person 94.3
black and white 93.4

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.7%
Calm 99.1%
Sad 0.7%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Angry 0%
Fear 0%
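
The age range, gender, and emotion rows above correspond to the face attributes AWS Rekognition returns when all attributes are requested. A minimal sketch is below, again assuming boto3, configured credentials, and a hypothetical filename; it illustrates the API call, not the museum's actual pipeline.

```python
import boto3

client = boto3.client("rekognition")

# "mill_worker.jpg" is a hypothetical filename standing in for this photograph.
with open("mill_worker.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```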

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a man sitting on a bench 31.7%

Text analysis

Amazon

$8
Jul
YТЭА-
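
The fragments above are raw OCR output, which is why they include partial and non-Latin characters. A minimal sketch of Rekognition text detection that would yield such lines is below, with the same assumptions as the earlier snippets.

```python
import boto3

client = boto3.client("rekognition")

# "mill_worker.jpg" is a hypothetical filename standing in for this photograph.
with open("mill_worker.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # print line-level detections only
        print(detection["DetectedText"])
```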

Google

YT37A2-XAGO
YT37A2-XAGO