Human Generated Data

Title

Untitled (woman standing by plane propeller)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2735

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Clothing 99.6
Apparel 99.6
Person 98.9
Human 98.9
Face 97.7
Pants 97.1
Chair 95.4
Furniture 95.4
Female 95
Smile 92.7
Outdoors 89.9
Airplane 84.8
Aircraft 84.8
Transportation 84.8
Vehicle 84.8
Girl 83.2
Nature 82.2
Denim 80.6
Jeans 80.6
Woman 78.9
Portrait 78.8
Photography 78.8
Photo 78.8
Dress 78.7
Grass 77.2
Plant 77.2
Kid 76.2
Child 76.2
Teen 61.9
Standing 59.7
Door 57.6
Play 57
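
The scores above are Rekognition confidence values on a 0-100 scale. A minimal sketch of how labels like these could be reproduced with boto3's DetectLabels call, assuming configured AWS credentials; the filename and the MinConfidence cutoff are illustrative, not part of the record:

import boto3

# Assumes AWS credentials and a default region are configured;
# the filename is a hypothetical local copy of the scanned print.
client = boto3.client("rekognition")

with open("annas_untitled_propeller.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out near 57
    )

for label in response["Labels"]:
    # Confidence is a 0-100 float, matching the scores shown above
    print(f"{label['Name']} {label['Confidence']:.1f}")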

Imagga
created on 2022-01-16

person 22.8
sky 20.4
propeller 17.3
man 16.8
people 16.7
umbrella 16.6
parasol 15.8
screw 15.3
sport 14.8
lifestyle 14.5
aviator 14.4
fun 14.2
adult 13.6
professional 13.6
smile 13.5
energy 13.4
male 12.8
happy 12.5
portrait 12.3
summer 12.2
outdoors 11.9
happiness 11.7
mechanism 11.5
device 11.5
outdoor 11.5
water 11.3
landscape 11.2
freedom 11
vacation 10.6
pretty 10.5
attractive 10.5
wave 10.4
wind 10.3
black 10.2
mechanical device 10.2
light 10.1
smiling 10.1
horizon 9.9
grass 9.5
travel 9.1
fashion 9
technology 8.9
sun 8.9
turbine 8.8
beach 8.7
face 8.5
modern 8.4
leisure 8.3
sand 8.3
air 8.3
silhouette 8.3
human 8.2
weapon 8.2
one 8.2
laptop 8.2
exercise 8.2
style 8.2
sunset 8.1
business 7.9
day 7.8
sea 7.8
color 7.8
model 7.8
clouds 7.6
hand 7.6
elegance 7.6
cheerful 7.3
lady 7.3
success 7.2
sexy 7.2
fitness 7.2
suit 7.2
body 7.2
active 7.2
job 7.1
work 7.1
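
Imagga's scores are also 0-100 confidences. A minimal sketch against its v2 tagging endpoint, with placeholder credentials and image URL; the response shape follows Imagga's public API docs and is an assumption here:

import requests

# Key, secret, and image URL are all placeholders
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas_untitled_propeller.jpg"},
    auth=auth,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs a localized tag name with a 0-100 confidence
    print(item["tag"]["en"], round(item["confidence"], 1))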

Microsoft
created on 2022-01-16

ground 96.4
text 92.6
person 90.6
outdoor 88.3
posing 82.1
black and white 77.6
aircraft 61
arm 24.3
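
The Microsoft tags likewise carry confidence scores. A minimal sketch against the Azure Computer Vision v3.2 tag endpoint; the resource endpoint, key, and image URL are placeholders, and the API's native 0-1 confidence is scaled here to match the percentages above:

import requests

endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/annas_untitled_propeller.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is returned as a 0-1 float
    print(tag["name"], round(tag["confidence"] * 100, 1))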

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 91.7%
Happy 78.9%
Calm 15.5%
Confused 3.4%
Sad 0.9%
Disgusted 0.6%
Surprised 0.3%
Angry 0.2%
Fear 0.2%
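
These attributes match the output of Rekognition's DetectFaces call with the full attribute set enabled. A minimal sketch, reusing the hypothetical local scan from above:

import boto3

client = boto3.client("rekognition")
with open("annas_untitled_propeller.jpg", "rb") as f:
    # Attributes=["ALL"] is required to get age, gender, and emotions
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence per emotion class
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")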

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
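
Google Vision reports face attributes as bucketed likelihoods ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch with the google-cloud-vision client; the filename is again a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("annas_untitled_propeller.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)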

Feature analysis

Amazon

Person 98.9%
Airplane 84.8%

Captions

Microsoft

a vintage photo of a young man holding a baseball bat 34.5%
a vintage photo of a man holding a baseball bat 34.4%
a vintage photo of a man 34.3%
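
Ranked captions like these come from Azure Computer Vision's describe endpoint. A minimal sketch (v3.2 REST, placeholder endpoint and key) requesting three candidate captions, as shown above:

import requests

endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/annas_untitled_propeller.jpg"},
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Confidence is a 0-1 float; scale it to match the figures above
    print(caption["text"], round(caption["confidence"] * 100, 1))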

Text analysis

Amazon

YT33A2
YT33A2 ИЛМТ2АЗ
NAGOY
our
ИЛМТ2АЗ
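
A minimal sketch of the Rekognition DetectText call behind detections like those above, reusing the hypothetical local scan; fragmentary output is unsurprising when the text in frame is incidental rather than ordinary print:

import boto3

client = boto3.client("rekognition")
with open("annas_untitled_propeller.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])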

Google

VT A 2. JAAMT2A
VT
A
2.
JAAMT2A
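
Google's output follows the text_detection convention of returning the full detected string first, then the individual tokens, which matches the list above. A minimal sketch:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("annas_untitled_propeller.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full block; the rest are single tokens
for annotation in response.text_annotations:
    print(annotation.description)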