Human Generated Data

Title

Untitled (group standing in town square with gifts of food and supplies)

Date

1948

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3438

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Wheel 99.2
Machine 99.2
Person 93
Clothing 87.2
Apparel 87.2
Car 86.3
Transportation 86.3
Vehicle 86.3
Automobile 86.3
Person 82.2
People 78.4
Person 78.1
Wheel 77.7
Meal 75.9
Food 75.9
Female 70.5
Person 69.1
Urban 68.9
Monitor 65.5
Electronics 65.5
Display 65.5
Screen 65.5
Crowd 65.1
Person 64.6
Face 64
Car 63
Photography 61.8
Photo 61.8
City 61.7
Town 61.7
Building 61.7
Person 61.1
Downtown 57.7
Person 57.4
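
The Amazon entries above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation, which this "Machine Generated Data" section appears to draw on; the repeated Person and Wheel entries most likely correspond to separate detected instances of the same label, each with its own confidence. The record does not document the exact request that was used, so the following is only a minimal sketch with boto3 in which photo.jpg, MaxLabels, and MinConfidence are assumed placeholders.

    import boto3

    # Minimal sketch: request image labels from Amazon Rekognition.
    # "photo.jpg", MaxLabels, and MinConfidence are illustrative placeholders,
    # not values taken from this record.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    # Print "Name Confidence" pairs, similar in shape to the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")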

Clarifai
created on 2023-10-26

people 99.9
many 99.5
group 99
group together 98.3
adult 97.9
man 95.7
child 91.6
woman 90.3
crowd 89.7
wear 87
several 85.7
administration 85.2
outfit 84.1
monochrome 83.5
leader 83.3
home 82.9
furniture 82.7
uniform 81.1
education 81.1
spectator 76.3

Imagga
created on 2022-01-22

cemetery 100
gravestone 33.8
memorial 29.8
stone 22.4
structure 20.3
old 20.2
city 20
grunge 19.6
architecture 18
vintage 15.7
landscape 15.6
sky 14.7
building 13.9
travel 13.4
grungy 13.3
antique 13
scene 12.1
black 11.4
pattern 10.9
urban 10.5
art 10.5
texture 10.4
house 10
industrial 10
transportation 9.9
history 9.8
design 9.6
light 9.4
paint 9.1
dirty 9
retro 9
water 8.7
cityscape 8.5
buildings 8.5
negative 8.5
town 8.3
decoration 8.3
silhouette 8.3
lake 8.2
transport 8.2
aged 8.1
border 8.1
brown 8.1
tower 8.1
snow 7.8
ancient 7.8
space 7.8
dirt 7.6
old fashioned 7.6
bridge 7.6
frame 7.5
tourism 7.4
grain 7.4
rough 7.3
road 7.2
religion 7.2
grass 7.1
paper 7.1
rural 7
sea 7
scenic 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.6
old 88.1
person 68.7
black 68
vintage 60.9
clothing 58.4

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 88.7%
Calm 74.8%
Happy 21%
Sad 1.5%
Disgusted 1.4%
Confused 0.4%
Fear 0.3%
Angry 0.2%
Surprised 0.2%

AWS Rekognition

Age 28-38
Gender Female, 67.5%
Happy 92.9%
Calm 4.7%
Sad 1%
Surprised 0.4%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 93.1%
Calm 84.3%
Sad 5.7%
Happy 3.9%
Confused 3.3%
Angry 1%
Disgusted 0.9%
Surprised 0.7%
Fear 0.3%

AWS Rekognition

Age 24-34
Gender Male, 66.1%
Calm 26.4%
Fear 26.2%
Surprised 24.5%
Happy 15.4%
Angry 2.6%
Sad 2.3%
Confused 1.9%
Disgusted 0.7%

AWS Rekognition

Age 41-49
Gender Male, 93.6%
Happy 86.9%
Calm 8.9%
Sad 1.8%
Angry 0.6%
Confused 0.6%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Female, 70.7%
Calm 98.4%
Confused 0.7%
Sad 0.6%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.2%
Sad 38.5%
Confused 26.7%
Calm 17.4%
Disgusted 7.4%
Happy 3.9%
Angry 2.5%
Fear 2.1%
Surprised 1.5%

AWS Rekognition

Age 40-48
Gender Female, 92.5%
Calm 82.9%
Happy 5.9%
Sad 4.9%
Surprised 2.5%
Disgusted 1.9%
Angry 0.8%
Confused 0.6%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 72.8%
Calm 59.1%
Sad 32.1%
Surprised 2.9%
Fear 2.4%
Happy 1.2%
Disgusted 1.1%
Angry 0.7%
Confused 0.4%

AWS Rekognition

Age 22-30
Gender Male, 66%
Calm 65.6%
Sad 15.2%
Happy 8.2%
Fear 5.5%
Confused 2.4%
Disgusted 1.5%
Angry 1%
Surprised 0.7%

AWS Rekognition

Age 48-54
Gender Female, 73%
Happy 39.1%
Fear 29.4%
Calm 11.9%
Surprised 6.4%
Disgusted 4.7%
Angry 3.3%
Confused 3%
Sad 2.3%

AWS Rekognition

Age 19-27
Gender Female, 99.6%
Happy 53%
Calm 26.9%
Fear 9.4%
Confused 3.3%
Angry 2.3%
Disgusted 2.1%
Sad 1.7%
Surprised 1.3%

AWS Rekognition

Age 30-40
Gender Male, 95.3%
Happy 52.4%
Calm 22.4%
Confused 9.1%
Sad 8.2%
Surprised 3.3%
Disgusted 2%
Angry 1.8%
Fear 0.8%

AWS Rekognition

Age 23-33
Gender Male, 64.4%
Happy 74.9%
Calm 14.3%
Surprised 4.2%
Sad 2.7%
Angry 1.3%
Confused 0.9%
Disgusted 0.9%
Fear 0.6%

AWS Rekognition

Age 34-42
Gender Male, 98%
Happy 64.6%
Calm 30.1%
Sad 1.4%
Fear 1%
Surprised 1%
Confused 0.9%
Angry 0.6%
Disgusted 0.3%

AWS Rekognition

Age 33-41
Gender Female, 97.6%
Calm 47%
Happy 36.8%
Sad 13.7%
Disgusted 0.7%
Confused 0.7%
Surprised 0.5%
Angry 0.4%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
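
The per-face estimates above follow the shape of Amazon Rekognition's DetectFaces output (an age range, a gender guess with confidence, and a ranked list of emotion scores for each detected face), while the Google Vision entries report bucketed likelihoods (Very unlikely through Very likely) rather than percentages. As an illustrative sketch of the Rekognition side only, with photo.jpg as an assumed placeholder for the image file:

    import boto3

    # Minimal sketch: per-face age range, gender, and emotion estimates
    # from Amazon Rekognition DetectFaces. "photo.jpg" is a placeholder.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort by confidence, as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")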

Feature analysis

Amazon

Person 99.3%
Wheel 99.2%
Car 86.3%

Text analysis

Amazon

MILK
PET
:
100
Carnation
ALE

Google

arnation PET MILK MILK
arnation
PET
MILK
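
The text results above are OCR-style detections of signage visible in the photograph (for example the Carnation and PET MILK lettering); the truncated "arnation" fragments appear to be part of the machine output rather than a transcription error in this record. As a minimal sketch of how such word lists can be produced with Amazon Rekognition's DetectText call, again with photo.jpg as an assumed placeholder:

    import boto3

    # Minimal sketch: detect text (signage, lettering) in an image with
    # Amazon Rekognition. "photo.jpg" is an illustrative placeholder.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a LINE or a WORD; print the detected strings.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])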