Human Generated Data

Title

Untitled (group of women sitting on lawn)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19343

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Yard 100
Outdoors 100
Nature 100
Grass 100
Plant 100
Shelter 100
Building 100
Countryside 100
Rural 100
Dress 99.7
Clothing 99.7
Apparel 99.7
Person 99.5
Human 99.5
Face 99.2
Person 98.9
Person 98.8
Female 98.7
Person 98.6
Person 98
Housing 97.7
Person 97.2
Blonde 96.7
Woman 96.7
Kid 96.7
Girl 96.7
Teen 96.7
Child 96.7
Person 96.6
Smile 96.3
Person 95.8
Backyard 95.1
Neighborhood 92.9
Urban 92.9
Chair 91.6
Furniture 91.6
Park 89.4
Lawn 89.4
Person 86.7
House 85.3
Person 82.4
Portrait 81.9
Photography 81.9
Photo 81.9
Person 77.5
Villa 74.5
Costume 73.3
People 73.2
Tree 71.6
Suit 70
Coat 70
Overcoat 70
Leisure Activities 64.4
Meal 63.9
Food 63.9
Vacation 62.3
Man 61.2
Shorts 59.4
Skirt 59
Field 58.4
Play 58.2
Baby 56.6
Grassland 55.7
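
The numbers beside each Amazon tag are Rekognition confidence scores on a 0-100 scale. Below is a minimal sketch of how such labels could be reproduced with the AWS SDK for Python (boto3); the file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not part of the original annotation pipeline.

```python
# Sketch: label detection with Amazon Rekognition via boto3.
# "photo.jpg", MaxLabels and MinConfidence are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=60,
        MinConfidence=55,
    )

# Each label carries a 0-100 confidence score, matching the figures
# listed above (e.g. "Yard 100", "Grassland 55.7").
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```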

Clarifai
created on 2023-10-22

people 99.9
child 99.5
group together 99.3
wear 98.9
group 98.7
boy 97.2
adult 96.9
veil 96.3
woman 95.8
many 95.6
education 94.2
man 94.1
family 93
several 92.6
school 92.3
outfit 90.1
baseball 86.4
adolescent 85.6
recreation 84.5
five 84
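
Clarifai scores are likewise confidence values, here shown as percentages. The sketch below queries Clarifai's general image-recognition model over its public v2 REST API; the model ID, API key, and image URL are assumptions made for illustration.

```python
# Sketch: concept tagging with the Clarifai v2 REST API via requests.
# The model ID, API key, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_KEY"
MODEL_ID = "general-image-recognition"  # assumed general model
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concept values are returned in 0-1; scaling by 100 gives the
# percentages listed above (e.g. "people 99.9").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```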

Imagga
created on 2022-03-05

football helmet 62
helmet 56
ball 51.4
headdress 38.1
sport 34.9
grass 30
man 29.6
soccer ball 29.4
game 28.5
competition 28.4
football 26.9
field 26.8
clothing 26.1
play 25
outdoors 23.9
people 22.9
active 22.5
soccer 22.1
equipment 22.1
team 21.5
male 19.9
player 19.9
kick 19.5
playing 18.2
family 17.8
athlete 17.8
game equipment 17.7
person 17.1
men 16.3
activity 16.1
outdoor 16
fun 15.7
park 15.6
child 15.5
outside 15.4
players 14.8
practice 14.5
lifestyle 14.4
leisure 14.1
action 13.9
sports 13.9
exercise 13.6
goal 13.4
children 12.8
consumer goods 12.5
covering 12.5
boy 12.2
recreation 11.6
run 11.6
happy 11.3
adult 11.2
fitness 10.8
league 10.8
golf 10.5
foot 10.5
summer 10.3
youth 10.2
smiling 10.1
together 9.6
father 9.6
running 9.6
athletic 9.6
couple 9.6
day 9.4
athletics 8.8
world 8.8
relax 8.4
portrait 8.4
brass 8.4
training 8.3
holding 8.3
protection 8.2
group 8.1
kid 8
practicing 7.9
smile 7.8
happiness 7.8
standing 7.8
pass 7.8
match 7.7
balls 7.7
joy 7.5
teamwork 7.4
teenager 7.3
girls 7.3
pedestrian 7.2
professional 7.1
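
A comparable sketch for Imagga's auto-tagging endpoint, using its v2 REST API with HTTP Basic authentication; the credentials and image URL are placeholders.

```python
# Sketch: auto-tagging with the Imagga v2 REST API via requests.
# The API key/secret and image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence,
# as in the list above (e.g. "football helmet 62").
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```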

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

building 99.8
outdoor 98.4
sport 83.9
person 81.2
text 80.8
black and white 76.4
clothing 76.2
house 73.7
posing 35.2
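
The Microsoft tags resemble output from the Azure Computer Vision tag operation. The sketch below uses the v3.2 REST API; the endpoint, subscription key, and file name are placeholders.

```python
# Sketch: image tagging with the Azure Computer Vision v3.2 REST API.
# Endpoint, subscription key, and file name are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

# Confidences are returned in 0-1; scaled to 0-100 they correspond to
# the values above (e.g. "building 99.8").
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```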

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Female, 69.2%
Calm 48.3%
Happy 48.1%
Surprised 2.2%
Confused 0.6%
Disgusted 0.3%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Male, 91.2%
Sad 46.9%
Calm 25.2%
Happy 18.5%
Fear 3.1%
Surprised 2.7%
Disgusted 1.7%
Angry 1.1%
Confused 0.7%

AWS Rekognition

Age 34-42
Gender Male, 81.6%
Calm 88%
Happy 10.5%
Surprised 0.7%
Confused 0.4%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 43-51
Gender Male, 99%
Calm 59%
Happy 36.9%
Confused 1.3%
Disgusted 0.9%
Sad 0.8%
Surprised 0.5%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 35-43
Gender Male, 51.5%
Calm 90.4%
Disgusted 6.7%
Confused 0.8%
Sad 0.7%
Happy 0.5%
Surprised 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 47-53
Gender Female, 97%
Calm 84.7%
Happy 13.4%
Fear 0.5%
Sad 0.4%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 51-59
Gender Female, 83.4%
Happy 36.4%
Surprised 33.7%
Calm 12.7%
Disgusted 5.7%
Confused 4.2%
Sad 2.7%
Angry 2.5%
Fear 2.1%
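
The age-range, gender, and emotion estimates above are the per-face attributes Rekognition returns when face detection is run with all attributes requested. A minimal sketch follows, with boto3 and a placeholder file name.

```python
# Sketch: face attribute estimation with Amazon Rekognition via boto3.
# "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; sorting by confidence gives the
    # ordering shown in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```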

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
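
The Google Vision blocks report likelihood buckets rather than percentages. A sketch using the google-cloud-vision Python client is below; the file name is a placeholder and credentials are assumed to be configured in the environment.

```python
# Sketch: per-face likelihoods with the Google Cloud Vision Python client.
# "photo.jpg" is a placeholder; application credentials are assumed set.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (UNKNOWN .. VERY_LIKELY), which the
# listing above renders as "Very unlikely", "Unlikely", and so on.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```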

Feature analysis

Amazon

Person
Person 99.5%
Person 98.9%
Person 98.8%
Person 98.6%
Person 98%
Person 97.2%
Person 96.6%
Person 95.8%
Person 86.7%
Person 82.4%
Person 77.5%
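
The per-person percentages above correspond to instance-level detections that Rekognition attaches to its "Person" label. A short sketch of reading them out, again with a placeholder file name:

```python
# Sketch: instance-level "Person" detections via Rekognition detect_labels.
# "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    if label["Name"] != "Person":
        continue
    # One bounding box and confidence per detected person,
    # matching the percentages listed above.
    for instance in label.get("Instances", []):
        print(f'Person {instance["Confidence"]:.1f}%', instance["BoundingBox"])
```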

Text analysis

Amazon

23
pg
199
100
KODAKSLA
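
The strings above are the kind of output Rekognition's text detection returns, here likely picking up film edge markings and numerals in the print. A minimal sketch, with a placeholder file name:

```python
# Sketch: text detection with Amazon Rekognition via boto3.
# "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE-level detections roughly correspond to the strings listed above;
# WORD-level detections are also returned but skipped here.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```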