Human Generated Data

Title

Untitled (stretcher, female accident victim)

Date

c. 1970, from 1954 negative

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18485

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.3
Human 99.3
Person 99.1
Person 89
Furniture 82.6
Apparel 77.3
Clothing 77.3
Person 74.8
Chair 74.7
Tire 74.5
Machine 74.3
Spoke 74.3
Person 71.9
Alloy Wheel 71.6
Wheel 71.6
Transportation 68.8
Vehicle 68.1
Bench 65.6
Car 64.6
Automobile 64.6
Wood 64.4
Shorts 59.5
Shoe 55.8
Footwear 55.8
Car Wheel 55.5
Finger 55.5

Imagga
created on 2022-02-25

stretcher 100
litter 100
conveyance 100
chair 33.3
seat 22.4
interior 19.4
man 17.5
people 16.2
furniture 15.8
equipment 15.6
male 15.6
office 14.4
work 14.1
medical 14.1
health 13.9
sitting 13.7
room 12.8
person 12.7
empty 12
care 11.5
transportation 10.7
adult 10.3
business 10.3
transport 10
indoors 9.7
exam 9.6
lifestyle 9.4
car 9.2
hospital 9
metal 8.8
working 8.8
medicine 8.8
automobile 8.6
modern 8.4
inside 8.3
patient 8.2
home 8
job 8
travel 7.7
clinic 7.7
vehicle 7.6
hand 7.6
relax 7.6
store 7.5
wheel 7.5
doctor 7.5
clean 7.5
support 7.5
water 7.3
wheelchair 7.3
black 7.2

Google
created on 2022-02-25

Furniture 94
White 92.2
Leg 91.2
Black 89.5
Style 83.9
Black-and-white 83.5
Motor vehicle 81.7
Comfort 81.3
Chair 78.3
Wood 77.8
Tints and shades 77.3
Human leg 73.7
Shorts 73
Monochrome photography 72.2
Classic 71.8
Monochrome 71.5
Flooring 70.7
Vintage clothing 69.1
Sitting 64.9
Bag 63.6

Microsoft
created on 2022-02-25

person 94.6
footwear 90.3
clothing 87.6
text 77.1
luggage and bags 60.1
old 53.3

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Female, 100%
Calm 85.1%
Confused 8.6%
Sad 1.8%
Angry 1.5%
Surprised 1.3%
Disgusted 1%
Fear 0.5%
Happy 0.3%

Feature analysis

Amazon

Person 99.3%
Chair 74.7%
Bench 65.6%
Car 64.6%
Shoe 55.8%

Captions

Microsoft

a group of people standing next to a car 38.7%
a group of people standing on top of a car 29.1%
a group of people standing on top of a suitcase 29%

Text analysis

Amazon

YT3
all