Human Generated Data

Title

Paris, July 14, 1951

Date

1951

People

Artist: Elliott Erwitt, American, 1928–2023

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. George A. Violin, P1995.52

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Person 98.9
Person 98.6
Person 97.9
Person 97.2
Person 96
Person 95.6
Person 95.1
Person 93.3
Clothing 92
Apparel 92
Shoe 76.7
Footwear 76.7
Person 75.8
Coat 72.2
Shoe 70.4
People 69.2
Chair 66.6
Furniture 66.6
Military 57.2
Military Uniform 57.1
Transportation 56.7
Vehicle 56.1
Porch 55.5
Person 50.6
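
The Amazon tags above pair a label name with a 0-100 confidence score, which is the shape of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such a list could be generated with boto3 follows; the filename is a placeholder, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch: reproducing an Amazon-style tag list with Rekognition's
# DetectLabels API via boto3. Credentials/region come from the environment.
import boto3

def rekognition_tags(image_path, min_confidence=50.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a 0-100 confidence score, matching
    # entries such as "Person 99.4" above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, conf in rekognition_tags("photo.jpg"):  # hypothetical filename
        print(f"{name} {conf:.1f}")
```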

Clarifai
created on 2023-10-25

people 99.9
woman 99.2
child 99.1
group 98.7
elderly 98.4
group together 98
man 97.3
adult 95.5
family 94.6
several 91.7
many 91.5
leader 91.4
wedding 90.9
four 90.7
street 89.4
home 88.5
offspring 88
administration 87.4
recreation 85.9
three 85.4
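
The Clarifai concepts above could plausibly be fetched from Clarifai's v2 REST API. The sketch below uses the requests library; the endpoint shape and the general-model identifier reflect the classic v2 predict call and are assumptions, as newer Clarifai accounts may require a user- and app-scoped URL.

```python
# Hedged sketch of a Clarifai v2 predict call. The API key and model ID are
# placeholders/assumptions; newer accounts may need a different URL shape.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed general-model identifier

def clarifai_tags(image_url):
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Clarifai scores concepts 0-1; scale to percentages as listed above.
    return [(c["name"], c["value"] * 100) for c in concepts]
```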

Imagga
created on 2022-01-08

pedestrian 32.3
people 24
walking 23.7
man 21.5
walk 20
uniform 17.9
clothing 17.7
street 16.6
male 16.4
military uniform 15.9
old 14.6
person 14.4
park 14
couple 13.9
road 13.5
city 13.3
tree 12.4
outdoor 12.2
child 12
path 11.3
travel 11.3
day 11
snow 10.8
building 10.8
tourist 10.3
men 10.3
nurse 10.2
family 9.8
landscape 9.7
hiking 9.6
spring 9.4
architecture 9.4
senior 9.4
backpack 9
summer 9
grass 8.7
tourism 8.2
covering 8
coat 8
adult 7.9
back 7.3
consumer goods 7.3
outdoors 7.1
mountain 7.1
trees 7.1
together 7
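
Imagga exposes image tagging through its v2 /tags endpoint with HTTP basic authentication; a minimal sketch follows, with placeholder credentials.

```python
# Minimal sketch: Imagga's v2 /tags endpoint. Key and secret are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"

def imagga_tags(image_url):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()
    # Each result pairs an English tag with a 0-100 confidence score.
    return [(t["tag"]["en"], t["confidence"])
            for t in response.json()["result"]["tags"]]
```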

Google
created on 2022-01-08

Motor vehicle 90.2
Tree 89.4
Vehicle 81.8
Tints and shades 76.9
Plant 74.8
Suit 74.8
Vintage clothing 74.7
Wheel 73.8
Classic 71.1
Event 70.7
Monochrome photography 70.5
Monochrome 67.3
Tire 65.7
History 63.9
Pole 62.5
Room 61.9
Sitting 58.8
Retro style 54.6
Baby Products 52.4
Formal wear 52
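
The Google tags above match the output of label detection in the Google Cloud Vision API, where each label carries a description and a 0-1 score. A minimal sketch with the google-cloud-vision client follows; the filename is a placeholder and GCP credentials are assumed to be configured in the environment.

```python
# Minimal sketch: label detection with the google-cloud-vision client.
from google.cloud import vision

def google_labels(image_path):
    client = vision.ImageAnnotatorClient()  # uses ambient GCP credentials
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scores are 0-1; scale to percentages to match the list above.
    return [(label.description, label.score * 100)
            for label in response.label_annotations]
```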

Microsoft
created on 2022-01-08

outdoor 99.7
tree 99.3
person 97.2
clothing 95.5
text 85
footwear 81.7
posing 78.8
people 59.6
group 59.1
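
The Microsoft tags are consistent with Azure Computer Vision's tagging operation. A sketch against the REST endpoint follows; the endpoint host, key, and v3.2 API version are assumptions.

```python
# Hedged sketch: Azure Computer Vision's tag operation over REST.
# Endpoint host, key, and the v3.2 API version are assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

def azure_tags(image_url):
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        json={"url": image_url},
    )
    response.raise_for_status()
    # Confidence is 0-1; scale to percentages to match the list above.
    return [(t["name"], t["confidence"] * 100)
            for t in response.json()["tags"]]
```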

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 100%
Calm 98.1%
Sad 0.7%
Disgusted 0.4%
Angry 0.3%
Surprised 0.2%
Confused 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Female, 100%
Happy 99.8%
Surprised 0%
Calm 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
Sad 0%

AWS Rekognition

Age 26-36
Gender Female, 100%
Happy 98.7%
Confused 0.3%
Surprised 0.3%
Sad 0.2%
Disgusted 0.2%
Angry 0.2%
Calm 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 93.2%
Surprised 2.4%
Angry 2.1%
Disgusted 0.7%
Fear 0.6%
Calm 0.5%
Confused 0.3%
Sad 0.3%

AWS Rekognition

Age 54-64
Gender Female, 99.3%
Happy 95.5%
Calm 1.4%
Surprised 0.8%
Fear 0.7%
Disgusted 0.5%
Sad 0.4%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 59-67
Gender Female, 97.1%
Happy 92.1%
Calm 3.3%
Fear 1.2%
Surprised 0.8%
Sad 0.7%
Angry 0.7%
Confused 0.7%
Disgusted 0.6%

AWS Rekognition

Age 54-64
Gender Male, 98.2%
Angry 97.4%
Sad 2.3%
Fear 0.1%
Calm 0.1%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 34-42
Gender Female, 89.3%
Happy 99.8%
Fear 0%
Surprised 0%
Disgusted 0%
Sad 0%
Angry 0%
Calm 0%
Confused 0%

AWS Rekognition

Age 54-62
Gender Male, 99.8%
Calm 97.6%
Angry 0.6%
Happy 0.5%
Confused 0.4%
Sad 0.3%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 62-72
Gender Male, 83.4%
Calm 97.6%
Disgusted 0.5%
Happy 0.5%
Confused 0.5%
Sad 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 96.2%
Sad 1.6%
Angry 0.9%
Disgusted 0.6%
Confused 0.3%
Surprised 0.2%
Happy 0.1%
Fear 0.1%
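
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and emotion scores that sum to roughly 100%. That is the shape of Rekognition's DetectFaces output when all facial attributes are requested; a minimal boto3 sketch follows, with a placeholder filename.

```python
# Minimal sketch: per-face attributes with Rekognition's DetectFaces API.
import boto3

def face_report(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence as in the blocks above.
        for emo in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
            print(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
```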

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 76.7%
Coat 72.2%
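
The feature analysis entries repeat a subset of the Amazon tags, which suggests they are the labels Rekognition could localize in the frame. In DetectLabels output, such labels carry a non-empty Instances list of bounding boxes; a hedged sketch of that filtering follows.

```python
# Hedged sketch: keep only Rekognition labels with localized instances
# (bounding boxes), which appears to match the feature analysis list above.
import boto3

def localized_features(image_path, min_confidence=50.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Labels without instances are scene-level tags; drop them here.
    return [(label["Name"], label["Confidence"])
            for label in response["Labels"] if label["Instances"]]
```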