Human Generated Data

Title

Untitled (boy scouts)

Date

1940

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1999

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.5
Person 99.4
People 96.4
Apparel 94.7
Clothing 94.7
Family 91.7
Person 90.3
Person 82.4
Robe 62.9
Fashion 62.9
Female 61.5
Outdoors 59.3
Shorts 57.8
Photography 57.3
Photo 57.3
Person 43.3

Clarifai
created on 2023-10-25

people 99.9
group together 98.6
group 98.3
adult 97.6
many 96.7
child 96
man 95.4
woman 92.7
wear 91.8
family 91.8
several 90.3
uniform 89.3
leader 87.4
outfit 85.4
boy 84.5
five 84.5
portrait 84
administration 83.6
retro 77.1
four 76

Imagga
created on 2021-12-14

fountain 51
structure 44.4
picket fence 43.5
fence 36
barrier 26.1
old 18.1
people 17.8
obstruction 17.5
black 16.8
snow 16.1
negative 16.1
vintage 15.7
grunge 15.3
park 14
outdoor 13.8
film 13.6
dirty 13.5
happiness 13.3
person 12.1
antique 12.1
man 12.1
women 11.9
art 11.8
retro 11.5
adult 11.1
tree 10.8
kin 10.6
groom 10.5
outdoors 10.4
landscape 10.4
scene 10.4
winter 10.2
water 10
bride 9.6
couple 9.6
weathered 9.5
travel 9.1
portrait 9.1
paint 9
dress 9
texture 9
happy 8.8
sepia 8.7
love 8.7
two 8.5
summer 8.4
dark 8.3
pattern 8.2
danger 8.2
style 8.2
textured 7.9
married 7.7
mask 7.7
walking 7.6
photographic paper 7.6
world 7.5
frame 7.5
fun 7.5
city 7.5
street 7.4
wedding 7.4
design 7.3
girls 7.3
color 7.2
holiday 7.2

Google
created on 2021-12-14

Plant 92.9
Window 88.1
Standing 86.4
Grass 81.5
Adaptation 79.3
Hat 78.1
Vintage clothing 76.8
Monochrome 75.7
Tree 75.4
Picture frame 71.2
Art 70.9
Monochrome photography 70.2
Font 70
Uniform 68.2
Room 67.8
House 67.6
History 66.5
Event 66.1
Visual arts 64.9
Stock photography 61.7

Microsoft
created on 2021-12-14

text 98.3
wedding dress 86.7
person 82.6
clothing 81.7
dress 77.3
bride 76.5
wedding 69.4
woman 61.7
posing 47.9
old 43.6

Face analysis

AWS Rekognition

Age 26-42
Gender Male, 54.9%
Calm 39.1%
Surprised 28%
Happy 19.7%
Confused 5.6%
Sad 2.7%
Angry 2%
Fear 1.5%
Disgusted 1.3%

AWS Rekognition

Age 36-52
Gender Male, 74.9%
Happy 47%
Calm 46.7%
Sad 2.1%
Angry 1.6%
Disgusted 0.8%
Surprised 0.8%
Confused 0.7%
Fear 0.2%

AWS Rekognition

Age 35-51
Gender Female, 57.5%
Calm 67.3%
Angry 12.9%
Happy 8.8%
Surprised 3.9%
Sad 3.5%
Fear 1.7%
Confused 1%
Disgusted 0.9%

AWS Rekognition

Age 37-55
Gender Female, 56.1%
Calm 84%
Happy 11.7%
Sad 1.6%
Angry 0.9%
Confused 0.7%
Surprised 0.5%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 24-38
Gender Male, 88%
Happy 67.1%
Angry 18%
Surprised 6.3%
Calm 6.1%
Confused 1.3%
Disgusted 0.6%
Sad 0.5%
Fear 0.2%

AWS Rekognition

Age 41-59
Gender Female, 94.3%
Calm 78.4%
Happy 7%
Sad 6.5%
Surprised 4.3%
Confused 1.5%
Fear 0.9%
Angry 0.8%
Disgusted 0.6%

AWS Rekognition

Age 50-68
Gender Female, 83%
Calm 96.3%
Sad 1%
Surprised 0.7%
Happy 0.7%
Angry 0.5%
Confused 0.4%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 48-66
Gender Female, 52.2%
Calm 76.2%
Happy 21.1%
Angry 0.9%
Sad 0.6%
Surprised 0.6%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 36-54
Gender Female, 88.5%
Calm 79.6%
Surprised 11.1%
Happy 3%
Confused 2%
Fear 1.4%
Angry 1.4%
Sad 1%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

paintings art 98.9%