Human Generated Data

Title

Untitled (woman in back of car, opening legs and not wearing underwear)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19457

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 94.3
Human 94.3
Clothing 90.4
Apparel 90.4
Outdoors 68.9
Transportation 67.9
Machine 67.7
Vehicle 66.9
Tire 63
Wheel 59.5
Shorts 59.2

Clarifai
created on 2023-10-22

people 99.9
one 98.4
adult 98.1
monochrome 98.1
portrait 97.6
vehicle 97.5
man 96
transportation system 95.4
child 95.3
street 95
two 94
watercraft 92.8
boy 90.9
aircraft 83
wear 82.6
facial expression 82.4
war 82.3
sit 82
recreation 82
vehicle window 81.1

Imagga
created on 2022-03-05

passenger 31.1
car 27.3
man 22.2
adult 22
person 21.9
people 21.2
sexy 18.5
vehicle 17.9
model 16.3
attractive 16.1
body 16
conveyance 15.1
male 15
happy 13.8
sitting 13.7
driver 13.6
portrait 12.9
face 12.8
motor vehicle 12.7
one 12.7
automobile 12.4
posing 12.4
fashion 12.1
human 12
pretty 11.9
women 11.9
lifestyle 11.6
auto 11.5
smile 11.4
looking 11.2
sensuality 10.9
black 10.8
transportation 10.8
ambulance 10.4
stretcher 10.3
clothing 10.1
blond 10
bus 10
driving 9.7
hair 9.5
youth 9.4
relaxation 9.2
nice 9.2
public transport 9.1
lady 8.9
couple 8.7
erotic 8.7
sexual 8.7
tramway 8.6
drive 8.5
travel 8.4
hand 8.4
slim 8.3
vintage 8.3
outdoors 8.2
sensual 8.2
smiling 8
brunette 7.8
happiness 7.8
hands 7.8
men 7.7
old 7.7
head 7.6
litter 7.5
world 7.4
holding 7.4
vacation 7.4
child 7.4
girls 7.3
sunglasses 7.2
day 7.1

Microsoft
created on 2022-03-05

outdoor 95.5
person 95.1
man 91.9
transport 84.1
text 77.3
black and white 69.2
vehicle 61.4
old 43.4

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 63.5%
Happy 61.1%
Fear 11.1%
Sad 10.6%
Surprised 7%
Angry 4.2%
Calm 2.5%
Disgusted 2%
Confused 1.4%

Feature analysis

Amazon

Person
Person 94.3%

Captions

Microsoft
created on 2022-03-05

a man riding on the back of a truck 39.6%

Text analysis

Amazon

TERMINA
PLING TERMINA
PLING
-
- -
a
LAMT2A3
4000
4000 adidas
CRUS
adidas

Google

KING TERNINA WARSEO
KING
TERNINA
WARSEO