Human Generated Data

Title

Untitled (two women modeling swim wear in indoor rainforest)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15524

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 98.7
Apparel 95.8
Clothing 95.8
Garden 77.6
Outdoors 77.6
Coat 64.1
Building 61.9
Architecture 59.6
Arbour 56.9

Imagga
created on 2022-03-05

man 37.6
male 33.3
uniform 31.8
person 27
people 21.7
helmet 21.2
clothing 19.7
worker 19.4
adult 19
military uniform 18
hat 17.1
sport 16.4
equipment 15.5
protection 15.4
weapon 15.1
job 15
portrait 14.9
work 14.1
safety 13.8
occupation 13.7
industry 12.8
industrial 12.7
soldier 12.7
outdoors 12.7
engineer 11.9
labor 11.7
military 11.6
outdoor 11.5
player 11.3
construction 10.3
mask 10.2
builder 9.9
covering 9.8
black 9.7
fun 9.7
war 9.6
jacket 9.5
gun 9.4
action 9.3
leisure 9.1
danger 9.1
recreation 9
one 8.9
activity 8.9
handsome 8.9
stage 8.9
yurt 8.8
protective 8.8
guy 8.7
lifestyle 8.7
ballplayer 8.6
building 8.5
athlete 8.5
summer 8.4
structure 8.2
active 8.2
dwelling 7.9
consumer goods 7.9
army 7.8
men 7.7
repair 7.7
hard 7.6
hobby 7.6
power 7.6
dark 7.5
human 7.5
style 7.4
exercise 7.3
factory 7.2
sky 7
season 7

Google
created on 2022-03-05

Rectangle 83
Plant 82.2
Tints and shades 77.4
Art 75.4
Visual arts 68.4
Pattern 64.5
Room 64.5
Vintage clothing 64.4
Glass 63.8
Facade 57.6
Metal 55.3
Square 54.1
Illustration 51.6
History 50.8

Microsoft
created on 2022-03-05

clothing 96.7
person 93.5
text 91.3
building 85.3
smile 73.5
plant 64.4
footwear 58.6
woman 57.3

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.1%
Happy 98.2%
Surprised 0.7%
Fear 0.4%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%
Calm 0.1%
Sad 0.1%

AWS Rekognition

Age 43-51
Gender Female, 98.1%
Happy 98.3%
Confused 0.3%
Calm 0.3%
Surprised 0.2%
Disgusted 0.2%
Sad 0.2%
Fear 0.2%
Angry 0.1%

Microsoft Cognitive Services

Age 18
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man and a woman standing in front of a building 76.8%
a person standing in front of a building 76.7%
a man and a woman standing in front of a window 67.1%

Text analysis

Amazon

A*2
MAGO
MJIR YTER A*2 MAGO
MJIR
YTER

Google

XAGO
XAGO